Daniel Adeboye
Published 3rd October 2025

How to deploy Langflow with Northflank

If you need a powerful yet simple way to build, run, and manage LLM-powered workflows, Langflow is one of the best open-source tools you can use. It provides a drag-and-drop interface for chaining together prompts, models, APIs, and logic, helping you create production-ready AI apps without writing boilerplate code.

With Northflank, you can deploy Langflow in minutes using a one-click template or set everything up manually. Northflank takes care of scaling, networking, and infrastructure while you focus on building with Langflow.

Prerequisite

Before you begin, create a Northflank account.

What this tutorial covers

  1. Deploying Langflow with a one-click template on Northflank
  2. Deploying Langflow manually on Northflank

What is Northflank?

Northflank is a developer platform that makes it easy to build, deploy, and scale applications, databases, jobs, and even GPU workloads. It abstracts Kubernetes with smart defaults, giving you production-ready deployments without losing flexibility.

Option 1: Deploy Langflow with a one-click template

You can launch Langflow on Northflank in just a few minutes using the ready-made template. This option is ideal if you want to quickly spin up Langflow or demo the platform without doing a manual setup.

Template overview

The Langflow deployment on Northflank includes:

  • 1 addon: PostgreSQL (for storing workflows and configs)
  • 1 secret group for managing environment variables (API keys, database credentials)
  • Deployment of Langflow from the Docker image: langflowai/langflow:latest

Getting started

  1. Visit the Langflow template on Northflank.
  2. Click “Deploy”.
  3. Northflank will automatically:
    • Create a project, database, secret group, and service
    • Deploy Langflow with the required configuration
    • Expose a public URL for your app
  4. Once live, open the public URL to access the Langflow builder UI in your browser.

Note: You’ll still need to add API keys for your LLM providers (e.g., OpenAI, Anthropic, Hugging Face).
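
For example, you could add entries like these to the template’s secret group. This is only a sketch: the variable names below are the providers’ conventional environment variables and the values are placeholders; depending on how your flows are built, you may instead paste keys directly into the relevant Langflow components.

OPENAI_API_KEY="{your-openai-key}"
ANTHROPIC_API_KEY="{your-anthropic-key}"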

Option 2: Deploy Langflow manually on Northflank

If you want more flexibility or need to customize your setup, you can deploy Langflow manually. This approach provides you with complete control over configuration and integration.

Note: You can also customize Northflank's one-click deploy templates.

Step 1: Create a Northflank project

Log in to your Northflank dashboard and click the “Create new” button (+ icon) in the top right corner. Then select “Project” from the dropdown.

Projects serve as workspaces that group together related services, making it easier to manage multiple workloads and their associated resources.

Step 2: Configure your project

You’ll need to fill out a few details before moving forward.

  • Enter a project name, such as langflow-project, and optionally pick a color for quick identification in your dashboard.

  • Select Northflank Cloud as the deployment target. This uses Northflank’s fully managed infrastructure, so you do not need to worry about Kubernetes setup or scaling.

    (Optional) If you prefer to run on your own infrastructure, you can select Bring Your Own Cloud and connect AWS, GCP, Azure, or on-prem resources.

  • Choose a region closest to your users to minimize latency.

  • Click Create project to finalize the setup.

Step 3: Create a PostgreSQL database

Inside your project, go to the Addons tab and click “Create new addon.” Select PostgreSQL as the addon type and give it a descriptive name such as langflow-db. Then select your preferred version and choose a compute plan size.

  • If you’re testing or experimenting, the smallest option is cost-effective and sufficient.
  • For production, we recommend starting with nf-compute-50. This provides more resources and stability, ensuring Langflow runs reliably under real workloads.

Once you’ve configured the settings, click Create addon to provision your database.

Step 4: Create a Secret group to store environment variables

Next, navigate to the Secrets tab and click "Create Secret Group." Name it something easy to recognize, such as langflow-secrets. This group will hold all the environment variables required by Langflow. You can find the full list of supported variables in the Langflow docs.

If you don’t want to track down each environment variable yourself, you can start with the preconfigured values below:

LANGFLOW_PORT="8000"
LANGFLOW_LOG_LEVEL="debug"
LANGFLOW_SECRET_KEY="{update this}"

Notes about these values:

  • LANGFLOW_PORT: This sets the internal port Langflow will run on (8000 is the default).
  • LANGFLOW_LOG_LEVEL: Set to "debug" for detailed logs during setup and troubleshooting.
  • LANGFLOW_SECRET_KEY: Required for security; generate a strong random string (a quick way to do this is shown below).
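
A quick way to generate a suitable LANGFLOW_SECRET_KEY, assuming you have OpenSSL or Python 3 available locally (either command prints a random string you can paste into the secret group):

# Option 1: OpenSSL
openssl rand -base64 32
# Option 2: Python's secrets module
python3 -c "import secrets; print(secrets.token_urlsafe(32))"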

Link the PostgreSQL addon to your secret group

Northflank addons (like PostgreSQL) expose connection details (username, password, etc.). Instead of manually copying these values into your secrets, you can link the addon directly to your secret group. This way, whenever the addon rotates credentials, your service automatically receives the updated values.

  • Under the Linked addons section, click “Configure” to map POSTGRES_URI to the alias LANGFLOW_DATABASE_URL.

Here we’re not inventing new values; we’re just telling Northflank to feed the Postgres URI into the environment variable name your service expects.
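
Once linked, the variable your service receives will look roughly like the example below. The values here are purely illustrative; the real URI is injected by Northflank from the addon:

LANGFLOW_DATABASE_URL="postgresql://<user>:<password>@<host>:5432/<database>"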

Finally, click Create secret group to save everything.

Step 5: Create a deployment service

Within your project, navigate to the Services tab in the top menu and click “Create new service”. Select Deployment and give your service a name such as langflow-app.

For the deployment source, choose External image and enter the official Langflow Docker image: langflowai/langflow:latest.
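
If you’d like to sanity-check the image locally before deploying, you can run the same image with Docker. This is a minimal sketch, assuming Docker is installed on your machine; by default the Langflow image listens on port 7860.

# Pull and run the official Langflow image locally
docker run -it --rm -p 7860:7860 langflowai/langflow:latest
# Then open http://localhost:7860 in your browser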

Select compute resources

Choose the compute size that best matches your workload:

  • Small plans are fine for testing or lightweight usage.
  • Larger plans are recommended for production, as Langflow can be resource-intensive under real-world traffic.

The flexibility to adjust resources later means you can start small and scale up as needed.

Set up a port so your app is accessible:

  • Port: 8000
  • Protocol: HTTP
  • Public access: enable this so your app is reachable from the internet

Northflank will automatically generate a secure, unique public URL for your service. This saves you from having to manage DNS or SSL certificates manually.

Deploy your service

When you’re satisfied with your settings, click “Create service.” Northflank will pull the image, provision resources, and deploy Langflow.

Once the deployment is successful, you’ll see your service’s public URL at the top right corner, e.g.: p01--langflow-app--lppg6t2b6kzf.code.run
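
Before opening the UI, you can confirm the service is responding with a quick curl, replacing the hostname below with your own public URL; you should see an HTTP 200 once Langflow has finished starting up.

# Send a HEAD request to the public URL
curl -I https://<your-public-url>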

Conclusion

Deploying Langflow on Northflank gives you a production-ready way to run your AI workflow builder without worrying about Kubernetes, networking, or scaling.

Whether you choose the one-click template for speed or the manual setup for full control, Northflank provides the infrastructure while Langflow powers your LLM-driven apps.

Together, they let you prototype, deploy, and scale AI workflows all from a simple, developer-friendly platform.
