# Docker Deployment
Docker allows applications to run inside lightweight containers that bundle the application and all required dependencies.
The Nife platform enables users to deploy applications directly from Docker images, allowing developers to quickly launch scalable containerized workloads.
This guide walks through the complete process of deploying a Docker image using the Nife dashboard.
## Step 1: Open the Applications Dashboard
Navigate to Applications from the left sidebar of the Nife dashboard.
The Applications page shows all currently deployed applications along with their status, regions, replicas, and deployment strategies.
From this page you can monitor running services or create a new deployment.

Click Deploy App to begin deploying a new application.
## Step 2: Choose the Deployment Type
After clicking Deploy App, the Quick Deploy page appears.
Here you can choose what type of workload you want to deploy.
Since Docker images run as containerized workloads, select Application.

This opens the Deploy Application wizard, which guides you through the deployment process step-by-step.
## Step 3: Configure Application Source
The first stage in the wizard is Source Configuration.
Here you define basic application information and the container image that will be deployed.
### Basic Information
Provide the following information:

- Organization – Select the organization where the application will be deployed.
- Application Name – Enter a unique name for the application, for example `my-awesome-app`.
### Docker Image

Select Docker Image as the source type, then enter the container image you want to deploy, for example `nginx:latest`.

You can also click Verify Image to check whether the image exists in the container registry and to retrieve its metadata before deployment.
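Image references follow the `repository:tag` pattern; when no tag is given, `latest` is assumed. The split can be sketched in shell as follows (the image name is the example from above; the parsing is illustrative and not part of the Nife dashboard):

```shell
# Split an image reference into repository and tag (illustrative only;
# does not handle registries with port numbers, e.g. myregistry:5000/app).
IMAGE="nginx:latest"
TAG="${IMAGE##*:}"    # text after the last ':'  -> latest
REPO="${IMAGE%:*}"    # text before the last ':' -> nginx
echo "repository=$REPO tag=$TAG"
```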
## Step 4: Configure Build Settings
In this step you define how network traffic reaches the container and how the container is configured at runtime.

### Ports
Ports define how network traffic is routed.
Example configuration:
| Setting | Value |
|---|---|
| Internal Port | 3000 |
| External Port | 80 |
- Internal Port – the port the application listens on inside the container.
- External Port – the port exposed to users and external clients.
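For comparison, when running the same container locally with the Docker CLI, the mapping is written as external:internal. A small shell sketch (the port values mirror the table above; `my-awesome-app` is the hypothetical image name from earlier):

```shell
# Docker CLI notation publishes ports as EXTERNAL:INTERNAL, e.g.:
#   docker run -d -p 80:3000 my-awesome-app
EXTERNAL=80
INTERNAL=3000
echo "requests on port $EXTERNAL are forwarded to container port $INTERNAL"
```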
### Environment Variables
Environment variables allow runtime configuration of the container.
Example: `NODE_ENV=production`
These variables are often used for configuration values such as environment mode, API keys, or service endpoints.
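Inside the container, the application reads these values from its process environment. A minimal shell sketch (the `development` fallback is illustrative, not Nife behavior):

```shell
# Read NODE_ENV from the environment, falling back to "development"
# when the variable is unset or empty.
NODE_ENV="${NODE_ENV:-development}"
echo "running in $NODE_ENV mode"
```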
## Step 5: Configure Resources
This step defines compute resources and deployment behavior.
### Resource Type
Choose the compute type depending on the workload requirements.
Available options include:
- CPU
- GPU
- TPU
- FPGA
- Edge AI
- Serverless
Most standard applications use CPU resources.
### Resource Requests (Minimum)

Resource requests define the minimum guaranteed resources allocated to the container. CPU is measured in millicores (250m = 0.25 of a core) and memory in binary units (512Mi = 512 mebibytes).
Example configuration:
| Resource | Value |
|---|---|
| CPU Request | 250m |
| Memory Request | 512Mi |
### Resource Limits (Maximum)
Resource limits define the maximum resources the container is allowed to consume.
Example configuration:
| Resource | Value |
|---|---|
| CPU Limit | 500m |
| Memory Limit | 1Gi |
This prevents containers from consuming excessive infrastructure resources.
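The quantities in the tables above use Kubernetes-style units: 1000m equals one CPU core, and 1Mi equals 1024 × 1024 bytes. A shell sketch converting the example values to absolute numbers (the arithmetic is illustrative, not a Nife API):

```shell
# Convert Kubernetes-style resource quantities to absolute values.
CPU_REQUEST_MILLICORES=250                        # "250m" -> 0.25 of one core
MEM_REQUEST_MI=512                                # "512Mi"
MEM_REQUEST_BYTES=$((MEM_REQUEST_MI * 1024 * 1024))
echo "cpu request: ${CPU_REQUEST_MILLICORES}m (0.25 core)"
echo "memory request: ${MEM_REQUEST_MI}Mi = ${MEM_REQUEST_BYTES} bytes"
```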
### Deployment Strategy
Nife supports several deployment strategies:
- Rolling – Gradually replaces old containers with new ones.
- Blue-Green – Maintains two environments for safer updates.
- Canary – Gradually shifts traffic to new versions.
- Recreate – Stops old containers before starting new ones.
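The rolling strategy, for example, replaces replicas one at a time so that some capacity stays online throughout the update. A conceptual shell sketch (replica counts and version labels are hypothetical, not Nife's API):

```shell
# Conceptual rolling update: replace replicas one by one so the
# remaining replicas keep serving traffic during the rollout.
REPLICAS=3
OLD_VERSION="v1"
NEW_VERSION="v2"
for i in $(seq 1 "$REPLICAS"); do
  echo "replica $i: stopping $OLD_VERSION, starting $NEW_VERSION"
done
echo "rollout complete"
```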
## Step 6: Security Checks and Review
Before deployment begins, the Nife platform runs automated security and configuration checks.
### Unified Security Pipeline
The platform performs multiple checks such as:
- Container vulnerability scanning
- Infrastructure configuration validation
- Secret detection
These checks help prevent insecure deployments and configuration errors.
## Step 7: Deploy the Application
After reviewing the configuration, click Deploy.
The Nife platform will then:
- Pull the Docker image from the registry
- Allocate compute resources
- Create and start containers
- Apply the selected deployment strategy
- Launch the application
Once deployment is complete, the application appears in the Applications Dashboard, where you can monitor its health, view logs, and access the deployed service URL.
This workflow enables developers to deploy containerized applications quickly while benefiting from automated infrastructure management and security validation.