Leveling Up dFlow: New Features & Improvements in the Works

Akhil Naidu
13 Aug, 2025

This month at dFlow, we’ve been shipping a mix of core infrastructure upgrades, developer productivity features, and UI consistency improvements.
The work was driven by two goals:

  1. Make complex infrastructure dead simple - so teams can spend more time building and less time configuring.
  2. Ensure reliability at scale - so dFlow performs flawlessly whether you’re deploying one microservice or an entire multi-server architecture.

From automated plugin installation to secure database networking over Tailnet, and from Railway migration tools to a real-time terminal bubble, here’s the full breakdown of what’s new, what’s improved, and what’s coming next.

New Features & Integrations

1. External Database Integrations

We’re giving you more flexibility with your data. Soon, you’ll be able to provision and manage databases from best-in-class hosted providers, all from within dFlow.

First wave of integrations:

  • Neon → Managed PostgreSQL with branching support.
  • MongoDB Atlas → Fully managed MongoDB clusters.
  • Turso → Distributed SQLite databases.

Why it matters: You get the power to run databases outside of dFlow-managed servers, while still benefiting from tight integration like automatic environment variable injection into your services.
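
To make that concrete, here’s a minimal sketch of a service consuming a connection string that dFlow would inject after provisioning a hosted database. The env var name and the pg client are our own illustrative choices, not a fixed dFlow contract:

```ts
// Minimal sketch: a service reading the connection string dFlow injects
// after provisioning a hosted Postgres (the env var name is an assumption).
import { Pool } from "pg";

const pool = new Pool({ connectionString: process.env.DATABASE_URI });

export async function healthCheck(): Promise<boolean> {
  const { rows } = await pool.query("SELECT 1 AS ok");
  return rows[0]?.ok === 1;
}
```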

2. Tailnet-Powered Private DB Networking

We’re integrating Tailscale Tailnet so that databases and application servers can talk over a secure, private mesh network.

  • All new databases will be provisioned on dFlow-managed database servers by default.
  • They’ll be connected to application servers without public exposure.
  • Uses MagicDNS to ensure stable hostnames, even if IPs change.

This means no more managing firewall rules or public IPs; everything is secured at the network layer automatically.
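
In practice, an application server reaches its database through the MagicDNS name rather than a public address. A rough sketch, with a made-up hostname and credentials:

```ts
// Sketch only: connect to a dFlow-managed Postgres over the Tailnet.
// "db-primary.example-tailnet.ts.net" is a made-up MagicDNS hostname.
import { Pool } from "pg";

const pool = new Pool({
  host: "db-primary.example-tailnet.ts.net", // stable name even if the IP changes
  port: 5432,
  user: "app",
  password: process.env.DB_PASSWORD,
  database: "app",
  // No public listener or firewall rule needed: traffic stays inside the mesh.
});
```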

3. External DB Backups

We know not every database lives inside dFlow. That’s why we’re adding backup support for external databases, whether they’re hosted on Neon, Atlas, or even your own remote server.

  • Schedule encrypted backups to S3, Backblaze, or GCS.
  • Restore to the same or different database with one click.
  • Works for PostgreSQL, MySQL, and MongoDB in the first release.
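
Under the hood, a backup like this boils down to streaming a dump into object storage. A rough sketch for PostgreSQL, with placeholder bucket and key names (not dFlow’s actual implementation):

```ts
// Sketch: stream a pg_dump of an external Postgres database straight into S3.
import { spawn } from "node:child_process";
import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";

export async function backupPostgres(connectionString: string, bucket: string) {
  // Custom-format dump keeps restores flexible via pg_restore.
  const dump = spawn("pg_dump", ["--format=custom", connectionString]);

  const upload = new Upload({
    client: new S3Client({}),
    params: {
      Bucket: bucket,
      Key: `backups/pg-${new Date().toISOString()}.dump`, // placeholder key scheme
      Body: dump.stdout,
    },
  });

  await upload.done();
}
```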

4. Docker Backups via Restic

Data in your Docker volumes is just as important as your code. We’re introducing Restic-powered backups for container volumes:

  • Deduplication → Only store changed data to save space.
  • Encryption → Keep your backups secure at rest.
  • Multiple Destinations → S3, Backblaze, GCS, and more.
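
For the curious, the core of a Restic-backed volume backup could look like the sketch below; the repository URL is a placeholder and restic is assumed to be installed on the host:

```ts
// Sketch: take a deduplicated, encrypted snapshot of a Docker volume's data.
import { execFileSync } from "node:child_process";

export function backupVolume(volumePath: string) {
  execFileSync("restic", ["backup", volumePath, "--tag", "docker-volume"], {
    env: {
      ...process.env,
      RESTIC_REPOSITORY: "s3:s3.amazonaws.com/dflow-backups", // placeholder bucket
      RESTIC_PASSWORD: process.env.RESTIC_PASSWORD ?? "", // encrypts data at rest
    },
    stdio: "inherit",
  });
}
```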

Developer Experience Improvements

5. Automatic Plugin Installation

If a feature or deployment requires a plugin you don’t have, dFlow will now detect and install it automatically: no more manual setup steps or failed builds due to missing dependencies.
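
Conceptually, for a Dokku-backed server the check could look like this sketch (not our actual code): look up the installed plugin list before a deploy and install anything that’s missing.

```ts
// Sketch: ensure a required Dokku plugin exists before deploying.
import { execSync } from "node:child_process";

export function ensureDokkuPlugin(name: string, repoUrl: string) {
  const installed = execSync("dokku plugin:list", { encoding: "utf8" });

  if (!installed.includes(name)) {
    // Requires root on the server; the plugin repo URL would come from a known list.
    execSync(`sudo dokku plugin:install ${repoUrl}`, { stdio: "inherit" });
  }
}
```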

6. Terminal-Enabled Sync Bubble

The tiny “Syncing” toast notification in the bottom-right is getting an upgrade:

  • It becomes a persistent bubble.
  • Expands into a live terminal streaming logs for sync, builds, and deployments.
  • Can be minimized to just the indicator when you need more screen space.
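
Purely as an illustration of the idea, streaming those logs into the expanded bubble could be as simple as subscribing to a server-sent event stream (the endpoint path below is a made-up placeholder):

```ts
// Illustrative only: stream live log lines into the expanded terminal bubble.
const source = new EventSource("/api/deployments/123/logs"); // placeholder endpoint

source.onmessage = (event) => {
  appendToTerminal(event.data); // append each line as it arrives
};

function appendToTerminal(line: string) {
  console.log(line); // stand-in for writing to the terminal component
}
```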

7. Redis Queue Management

You’ll be able to purge Redis queues per server or across your entire team directly from the UI, which is great for clearing stuck jobs or corrupted queues without SSHing in.
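
If you’re wondering what a purge means mechanically: assuming the queues are BullMQ-backed (an assumption on our part), it’s roughly this sketch, with placeholder connection details:

```ts
// Sketch: wipe a single queue on a given server.
import { Queue } from "bullmq";

export async function purgeQueue(name: string) {
  const queue = new Queue(name, { connection: { host: "127.0.0.1", port: 6379 } });

  // obliterate() removes the queue and every job in it; queue.drain() is the
  // gentler option if you only want to drop waiting and delayed jobs.
  await queue.obliterate({ force: true });
  await queue.close();
}
```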

8. Server Details Caching in Redis

We’re caching server details in Redis for faster load times.

  • Cache refreshes automatically when empty.
  • Refresh also triggers when a new server is added.
  • Keeps the UI snappy even with large server fleets.
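
The pattern is a plain read-through cache; a simplified sketch with illustrative key names:

```ts
// Read-through cache sketch: serve server details from Redis, rebuild on miss.
import Redis from "ioredis";

const redis = new Redis();
const KEY = "servers:details"; // illustrative key name

export async function getServerDetails(loadFromDb: () => Promise<unknown[]>) {
  const cached = await redis.get(KEY);
  if (cached) return JSON.parse(cached);

  const servers = await loadFromDb(); // cache is empty, rebuild from the database
  await redis.set(KEY, JSON.stringify(servers));
  return servers;
}

// Called when a new server is added so the next read repopulates the cache.
export async function invalidateServerCache() {
  await redis.del(KEY);
}
```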

Infrastructure & Architecture

9. Database Migration Between Servers

Moving databases between servers will soon be a built-in feature in dFlow.

  • Works with PostgreSQL, MySQL, and MongoDB.
  • Transfers with minimal downtime.
  • Updates all dependent services automatically.
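
Conceptually, a PostgreSQL move is a dump from the source streamed into a restore on the target, followed by repointing dependent services; a simplified sketch with placeholder connection strings:

```ts
// Sketch: stream a Postgres database from one server to another.
import { spawn } from "node:child_process";

export function migratePostgres(sourceUrl: string, targetUrl: string) {
  const dump = spawn("pg_dump", ["--format=custom", sourceUrl]);
  const restore = spawn("pg_restore", ["--no-owner", "--dbname", targetUrl]);

  // Pipe the dump straight into the restore: no large intermediate file.
  dump.stdout.pipe(restore.stdin);

  restore.on("close", (code) => {
    if (code === 0) {
      // Here dFlow would update each dependent service's connection string.
    }
  });
}
```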

10. Railway → dFlow Migration Tools

We’re building a migration script that takes projects from Railway and recreates them in dFlow:

  • Imports env vars, service configs, and optionally data.
  • Works for cloud-hosted or self-hosted dFlow.

For ContentQL, we’ll be creating a dedicated account to house migrated projects for easier management.
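
At its core, the script maps each Railway service onto a dFlow service definition. The shapes below are purely illustrative, not real Railway or dFlow schemas:

```ts
// Illustrative mapping from an exported Railway service to a dFlow draft.
type RailwayService = {
  name: string;
  variables: Record<string, string>;
  source: { repo?: string; image?: string };
};

type DflowServiceDraft = {
  name: string;
  env: Record<string, string>;
  deploy: { type: "git" | "image"; ref: string };
};

export function toDflowService(svc: RailwayService): DflowServiceDraft {
  return {
    name: svc.name,
    env: svc.variables, // env vars copied across verbatim
    deploy: svc.source.repo
      ? { type: "git", ref: svc.source.repo }
      : { type: "image", ref: svc.source.image ?? "" },
  };
}
```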

11. Tailscale Machine Cleanup

When a server is deleted, its Tailscale machine entry will also be removed automatically, keeping your Tailnet free of unused devices.
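
A sketch of what that cleanup step could look like against Tailscale’s device API (the device id would be recorded when the server joins the tailnet; the token below is a placeholder):

```ts
// Sketch: remove a deleted server's machine from the tailnet.
export async function removeTailscaleDevice(deviceId: string) {
  const res = await fetch(`https://api.tailscale.com/api/v2/device/${deviceId}`, {
    method: "DELETE",
    headers: { Authorization: `Bearer ${process.env.TAILSCALE_API_TOKEN}` },
  });

  if (!res.ok) {
    throw new Error(`Failed to remove device ${deviceId}: ${res.status}`);
  }
}
```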

12. Ansible Configuration Management

We’re bringing in Ansible to standardize server provisioning. This means every server gets the same, predictable setup no matter when it’s deployed.
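
In practice that means the control plane runs the same playbook against every new host. A rough sketch, with placeholder playbook and inventory values:

```ts
// Sketch: run the provisioning playbook against a freshly added server.
import { execFileSync } from "node:child_process";

export function provisionServer(host: string) {
  execFileSync(
    "ansible-playbook",
    ["-i", `${host},`, "playbooks/provision-server.yml"], // trailing comma = ad-hoc inventory
    { stdio: "inherit" },
  );
}
```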

13. Cleanup Utilities

We’re rolling out tools to keep environments tidy:

  • Docker cleanup → Remove unused images, containers, volumes.
  • Dokku cleanup → Clear old releases, caches, and buildpacks.
  • dFlow cleanup → Purge outdated logs, temp files, and orphaned backups.
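
The Docker and Dokku passes are essentially the standard CLI cleanups, wired together; a sketch:

```ts
// Sketch of a server cleanup pass using the standard Docker/Dokku commands.
import { execSync } from "node:child_process";

export function cleanupServer() {
  // Stopped containers, dangling images, unused networks and build cache.
  execSync("docker system prune --force", { stdio: "inherit" });

  // Volumes are pruned separately because removing data is irreversible.
  execSync("docker volume prune --force", { stdio: "inherit" });

  // Dokku's built-in cleanup removes exited containers and dangling images.
  execSync("dokku cleanup", { stdio: "inherit" });
}
```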

UI/UX & CMS Updates

14. Tailwind v4 Migration

We’re upgrading from Tailwind v3 → v4 for better performance and a cleaner codebase. Along the way, we’ll audit every page to ensure a consistent UI.

15. Payload CMS Upgrade

We’re upgrading to the latest Payload CMS and replacing our custom trash system with the native trash feature.

  • Less custom code.
  • More reliable deletes/restores.
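
For a sense of what “native trash” means at the config level, here’s a sketch of a collection opting in; treat the exact option as our reading of Payload’s soft-delete/trash docs rather than a guarantee:

```ts
// Sketch: enable Payload's native trash (soft delete) on a collection.
import type { CollectionConfig } from "payload";

export const Projects: CollectionConfig = {
  slug: "projects", // illustrative collection
  trash: true, // deletes move documents to trash instead of removing them
  fields: [{ name: "name", type: "text", required: true }],
};
```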

16. Draft State for Architectures

You’ll soon be able to save architecture changes as drafts: work in progress until you’re ready to publish them live.
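
If the architectures collection lives in Payload, draft support can lean on Payload’s built-in versions and drafts; a sketch (not our actual schema):

```ts
// Sketch: a Payload collection with draft support enabled.
import type { CollectionConfig } from "payload";

export const Architectures: CollectionConfig = {
  slug: "architectures", // illustrative slug
  versions: {
    drafts: true, // save work in progress without publishing it live
  },
  fields: [{ name: "name", type: "text", required: true }],
};
```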

Coming Up Next Month

  • Migrating the dFlow marketing website into the dFlow App itself.
  • More external DB providers like PlanetScale and Supabase.
  • Smarter background job management.
  • Advanced analytics for deployments and performance tracking.

Final Thoughts

This month’s updates are all about control and reliability: more ways to manage your data, more automation for common tasks, and fewer manual steps for developers.

We’re pushing toward making dFlow the platform we want to use ourselves: fast, secure, flexible, and ready for any scale.
