For an existing project, a customer requested a new feature for a marketing campaign they were about to launch: in their ERP system, they maintain a huge list of projects, each with many associated properties and a high frequency of data change. Both existing projects get updated and new projects get added to the list. Those projects should be promoted on their Drupal site, but of course, there is no point in maintaining that data manually, let alone twice.

To make this a bit more challenging, the data should not only be displayed in a view. That alone would have been possible by configuring a secondary database for the Drupal site and telling Views to query that external database for the content. Instead, each of the projects should also come with a landing page displaying more details and a contact form so that visitors can easily submit their interest in it.

There’s only one reasonable approach for this: each project in the remote database needs to become a content entity in Drupal.

Not enough of a challenge yet? Then let me tell you that the external database is MS-SQL hosted on Azure. As if all that Microsoft infrastructure and tooling were what we beloved Drupal engineers wanted to get close to. But that’s a different story.

Defining the solution stack

As the main requirement is about content synchronization and creating entities in Drupal, there is only one answer: Drupal core’s migrate module. This may come as a surprise to some, but that set of modules is not only capable of one-off migrations, e.g. from Drupal 7 to 9. Migrate is an excellent tool to read data from an external source, process it, and store it in Drupal – not just once but as often as needed, and it contains all the required mechanisms to recognize existing and possibly updated data records. That it also includes a solid rollback feature is mentioned only to complete the picture, not because it’s required for this project.
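In practice, those mechanisms surface as a handful of Drush commands (assuming Drush is installed on the host; the migration ID projects is the one defined later in this post):

```shell
drush migrate:status projects    # processed vs. total source records
drush migrate:import projects    # import new and update changed records
drush migrate:rollback projects  # roll back everything this migration created
```

Re-running migrate:import only touches records that are new or have changed since the last run, which is exactly what a recurring synchronization needs.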

That left us with the requirement to talk to an MS-SQL database and, once that was resolved, to let Migrate accept an SQL database as a data source for all its magic. Following the old saying “There is a module for that”, we were lucky to find one for each of the two requirements:

- The sqlsrv module, which provides the MS-SQL database driver for Drupal.
- A module providing the custom_sql_query source plugin, so that Migrate can read straight from an SQL database.

Equipped with those three components, the implementation turned out to be extremely simple.

Implementation steps

Once we had downloaded and enabled the extra modules, we first defined the additional database in the site’s existing settings.php file:

$databases['azure']['default'] = [
  'database' => 'customer365projects',
  'username' => 'DrupalServiceUser',
  'password' => 'PASSWORD',
  'prefix' => '',
  'host' => '',
  'port' => '1433',
  'namespace' => 'Drupal\\sqlsrv\\Driver\\Database\\sqlsrv',
  'autoload' => 'modules/contrib/sqlsrv/src/Driver/Database/sqlsrv',
  'driver' => 'sqlsrv',
];

That’s all you need to do, assuming that the database host is accessible from the Drupal host over TCP/IP. If that’s not the case, which is very likely, then your IT has to establish a tunnel with either SSH or a VPN from the Drupal host to the database host. But that’s beyond the scope of this blog post.
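For completeness, a local SSH port forward is one way such a tunnel could look; the host names, user, and jump host below are placeholders, assuming a jump host that can reach the database:

```shell
# Forward local port 1433 to the MS-SQL host via a reachable jump host.
ssh -N -L 1433:mssql-server.example.com:1433 deploy@jumphost.example.com
```

With such a tunnel in place, the 'host' key in the settings above would point at 127.0.0.1.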

From now on, your Drupal site can “talk” to that MS-SQL database in Azure. Not that this was needed for our project, but to give an idea of what you’d have to do if you wanted to write custom code to read from or write to that database:

// Get the connection defined as $databases['azure']['default'] in settings.php.
$database = \Drupal\Core\Database\Database::getConnection('default', 'azure');
$query = $database->select('...');

The final step for us was to configure the migrate module to import data from this database. We created a migration configuration like this:

id: projects
label: Projects
migration_dependencies: { }
source:
  plugin: custom_sql_query
  key: azure
  ids:
    - id
  sql_query: "SELECT *
    FROM projects p
    INNER JOIN extras e ON =
    WHERE p.status = 'for-drupal'"
destination:
  plugin: entity:node
  default_bundle: project
process:
  title: project
  updated: timestamp
  field_a: property_a
  field_b: property_b
  field_c: property_c

This is a simplified version of the configuration to show the important building blocks of this solution. Of course, you can leverage all of Migrate’s power to process the data and store it in Drupal’s default database.
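For instance, a mapping in the process section can run a value through one of core’s process plugins instead of a plain copy. A hypothetical variant that trims whitespace from the source property before it becomes the node title could look like this:

```yaml
process:
  title:
    plugin: callback
    callable: trim
    source: project
```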

The key settings for talking to our external database in this configuration are the source plugin, key, and sql_query: the plugin custom_sql_query comes from the installed module, the key azure refers to the key we used in settings.php for that database, and the SQL query is entirely up to you and depends on the data model of your external database.

So, what’s next?

drush migrate:import projects

And you’re done. Really!


A pretty complex set of requirements gets resolved with just two contributed modules and two configurations: one in Drupal’s settings and one for Drupal core’s migrate module. No custom code required. And yet, the customer can now run the migrate:import command from a system crontab every 5 minutes. All their project data from their internal ERP system shows up on the Drupal site, always accurate and always up to date. It doesn’t get any better.
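Such a crontab entry could look roughly like this (the Drush path is an assumption about the server layout):

```shell
# Run the import every 5 minutes as the web user.
*/5 * * * * /var/www/drupal/vendor/bin/drush migrate:import projects
```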
