When setting up Content Hub 2.x, or when using Content Hub 1.x configured to export content with drush, you need to set up a cron job that handles that task for you.
The best approach is to ssh to the instance and practice running the drush command a few times, with all of the flags set, to make sure that you are getting the result you want before adding it to your Scheduled Jobs as a task to be triggered with cron.
It helps to get the name of the queue that you need to run. The simplest way to do this is with:
drush9 queue:list
You should get a list of queues and the number of items:
------------------------------------------ ------- ---------------------------------
Queue Items Class
------------------------------------------ ------- ---------------------------------
acquia_contenthub_export_queue 646 Drupal\Core\Queue\DatabaseQueue
acquia_contenthub_import_queue 0 Drupal\Core\Queue\DatabaseQueue
locale_translation 0 Drupal\Core\Queue\DatabaseQueue
media_entity_thumbnail 0 Drupal\Core\Queue\DatabaseQueue
seasonal_sort_queue 0 Drupal\Core\Queue\DatabaseQueue
------------------------------------------ ------- ---------------------------------
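If you want a cron-friendly way to check whether a queue actually has work before acting on it, you can parse the queue listing with a small helper. This is a sketch, not part of drush itself: the `queue_items` function name is hypothetical, and the output piped into it below is simulated from the sample table above rather than a live `drush9 queue:list` call.

```shell
#!/bin/sh
# Hypothetical helper: print the item count for a named queue,
# given `drush9 queue:list`-style output on stdin.
queue_items() {
  # $1 = queue name; matches column 1 and prints column 2 (Items)
  awk -v q="$1" '$1 == q { print $2 }'
}

# Simulated listing (in practice: drush9 queue:list | queue_items <name>)
printf '%s\n' \
  'acquia_contenthub_export_queue 646 Drupal\Core\Queue\DatabaseQueue' \
  'acquia_contenthub_import_queue 0 Drupal\Core\Queue\DatabaseQueue' \
  | queue_items acquia_contenthub_export_queue
# prints: 646
```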
To export content to Content Hub, use the drush queue-run command with the name of your export queue; in this case that's acquia_contenthub_export_queue. The -v flag gives you verbose logging, and -l sets the URI of the publisher site. Be sure to use https:// and the fully qualified domain name of the publishing site, because this command sets all paths that are used for images and some other types of content.
drush -v -l https://publisher-site.com --root=/var/www/html/<account>.<env>/docroot queue-run acquia_contenthub_export_queue
If you'd like to capture this command's output in a log file, append a redirection:
drush -v -l https://publisher-site.com --root=/var/www/html/<account>.<env>/docroot queue-run acquia_contenthub_export_queue &>> /var/log/sites/${AH_SITE_NAME}/logs/$(hostname -s)/ach-ex0-$(date +%F).log
There may be times when you want to wrap the command in flock to prevent cron jobs from overlapping. This is what this command does:
flock -xn /tmp/ach-exp0.lck -c "drush -v -l https://publisher-site.com --root=/var/www/html/<account>.<env>/docroot queue-run acquia_contenthub_publish_export" &>> /var/log/sites/${AH_SITE_NAME}/logs/$(hostname -s)/ach-ex0-$(date +%F).log
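To see why flock -xn prevents overlap, here is a small self-contained demo (the lock file path and sleep durations are arbitrary choices for illustration): the first invocation holds the exclusive lock, so the second, non-blocking invocation exits immediately instead of running a second copy of the job.

```shell
#!/bin/sh
# Demo of flock -xn semantics: -x takes an exclusive lock, -n makes
# the attempt non-blocking, so a second run is skipped rather than queued.
lock=/tmp/ach-flock-demo.lck

flock -xn "$lock" -c 'sleep 1' &   # first "job" holds the lock
sleep 0.2                          # give it time to acquire the lock

if flock -xn "$lock" -c 'echo ran'; then
  echo "overlap allowed"
else
  echo "skipped: previous run still holds the lock"
fi
wait
```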
If you are using Content Hub syndication and you need to import content in a scheduled (cron) job, the command takes the same shape: it includes flock, verbose logging, and the docroot path for the import site.
flock -xn /tmp/ach-imp1.lck -c "drush -v -l https://subscriber-site.com --root=/var/www/html/<account>.<env>/docroot queue-run acquia_contenthub_import_queue" &>> /var/log/sites/${AH_SITE_NAME}/logs/$(hostname -s)/ach-i1-$(date +%F).log
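Put together, a Scheduled Jobs / crontab entry for the import job might look like the following sketch. The */10 interval is an assumption (tune it to your syndication volume), and the lock file name, domain, and docroot path should be replaced with your own values.

```shell
# Hypothetical crontab entry: run the import queue every 10 minutes,
# skipping the run if the previous one is still holding the lock.
*/10 * * * * flock -xn /tmp/ach-imp1.lck -c "drush -v -l https://subscriber-site.com --root=/var/www/html/<account>.<env>/docroot queue-run acquia_contenthub_import_queue" &>> /var/log/sites/${AH_SITE_NAME}/logs/$(hostname -s)/ach-i1-$(date +%F).log
```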
Please reach out in a Support ticket if you have additional questions.