---
title: "HiveMQ and TimescaleDB: It just works!"
published: 2026-05-01T13:57:13.000-04:00
updated: 2026-05-01T13:57:12.000-04:00
excerpt: "For IIoT engineers: a personal take on integrating HiveMQ and TimescaleDB. One Postgres connector, no custom pipeline, no extra database to run."
tags: MQTT, IoT
authors: Doug Pagnutti
---

> **TimescaleDB is now Tiger Data.**

A few years ago I was working at a plant that manufactured explosives for the oilfield. One of the most important steps in the process was measuring very precise quantities of explosive powder for every charge. The way it worked was that we (and by we, I mean machines) would dose out a specific quantity of powder, then transfer that powder to a second container to verify the weight. If the weight was good, the powder went into the charge. If the weight was bad, the line dumped the powder and started the cycle over.

It would take a little fiddling at the beginning of a shift to get it working, but then it would run … for a while. After a few hours (sometimes less) the weight check would start failing, the powder would be dumped, and the line would back up. The operator might be able to get it running again with some more fiddling, but this process was killing our production numbers.

So we contracted with a company that specialized in machine learning. Their goal would be to monitor the dosing data and iterate on the parameters before the weights went out of spec. For the POC, they wanted to run the models on their own servers and send us back recommendations.

That meant I needed a simple, secure, performant way to ship real-time data out of our SCADA system to a partner's infrastructure. I had used HiveMQ's public broker on a few personal projects, so this felt like the right time to try the enterprise version. Glad I did. Setup was easy, and I had data flowing from our servers to theirs within a couple of hours (as long as you don't count the days spent waiting for my firewall request to get approved).

The flexibility paid off later, too. As the model evolved, the data scientists kept asking for more context. Maybe I should have thought of this ahead of time, but batch ID (for the explosive powder) and humidity/temperature turned out to be key inputs. Each time, I added a topic and a publisher without rewriting anything on either side. A few months later, we had the whole plant running the suggested parameters. The data pipeline was so reliable, we decided not to even bother migrating their server to our plant.

I was already a fan of MQTT for its lightweight, well-structured format, and during this project I became a fan of HiveMQ Enterprise for providing an MQTT broker that "just works".

## HiveMQ → TimescaleDB

That dosing problem feels like ancient history now (7 years at the time of writing), but when I was asked what other industrial software helps engineers tackle IIoT data, HiveMQ Enterprise was one of the first that came to mind. If you have a bunch of disparate data sources, and you're trying to collect that data in one place, HiveMQ (and MQTT more generally) is a great way to do it.
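To make that concrete, here's a minimal sketch of what publishing a dosing reading over MQTT could look like. The topic, broker address, credentials, and field names are all illustrative assumptions, not details from the actual plant setup; the payload shape is just one reasonable way to structure a reading as JSON.

```python
import json
import time


def build_payload(batch_id: str, weight_g: float, temp_c: float, humidity_pct: float) -> str:
    """Shape one dosing reading as a JSON MQTT payload (field names are illustrative)."""
    return json.dumps({
        "ts": time.time(),          # Unix timestamp of the reading
        "batch_id": batch_id,       # powder batch identifier
        "weight_g": weight_g,       # measured dose weight
        "temp_c": temp_c,           # ambient temperature
        "humidity_pct": humidity_pct,
    })


# Publishing with the widely used Eclipse paho-mqtt client might look roughly
# like this (sketch only; broker host, port, and topic are assumptions):
#
#   import paho.mqtt.client as mqtt
#   client = mqtt.Client()
#   client.username_pw_set("plant-user", "secret")   # broker credentials
#   client.tls_set()                                 # TLS for off-site traffic
#   client.connect("broker.example.com", 8883)
#   client.publish("plant/line1/dosing",
#                  build_payload("B-1042", 12.503, 21.4, 38.0), qos=1)
```

Adding more context later (humidity, batch ID, and so on) is then just a matter of adding fields to the payload, or publishing to an additional topic.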

That handles transport. The other half of the question is where the data ends up. For small workloads, pointing the broker at a vanilla [Postgres database](https://www.tigerdata.com/blog/its-2026-just-use-postgres) will work great. For large workloads (which HiveMQ can handle), use TimescaleDB. It's built specifically for time-series workloads, so it can handle the 24/7 stream of data that HiveMQ will pass to it.

Best of all, and consistent with my claim of 'it just works', HiveMQ already has a built-in PostgreSQL connector. TimescaleDB is a PostgreSQL extension, so all you have to do to get your MQTT data into a TimescaleDB hypertable is to install and configure the connector and make sure the table schema matches your MQTT message format. No modifying queries, no temporary connector tables. Just two great products working together to put data where it's useful.
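The "make sure the table schema matches your MQTT message format" step is worth sketching. Below is a hypothetical hypertable DDL and a small check that a JSON message carries exactly the columns the table expects. All names (`dosing`, `ts`, `weight_g`, etc.) are assumptions for illustration; `create_hypertable` is TimescaleDB's real function for converting a regular table into a hypertable, but consult the TimescaleDB docs for the exact form your version expects.

```python
import json

# Hypothetical hypertable DDL — table and column names are assumptions.
# A TimescaleDB hypertable starts life as an ordinary Postgres table,
# then create_hypertable() partitions it by the time column.
DDL = """
CREATE TABLE dosing (
    ts           TIMESTAMPTZ NOT NULL,
    batch_id     TEXT,
    weight_g     DOUBLE PRECISION,
    temp_c       DOUBLE PRECISION,
    humidity_pct DOUBLE PRECISION
);
SELECT create_hypertable('dosing', 'ts');
"""

# Columns the connector will map MQTT message fields onto.
EXPECTED_COLUMNS = {"ts", "batch_id", "weight_g", "temp_c", "humidity_pct"}


def matches_schema(mqtt_payload: str) -> bool:
    """True when a JSON MQTT message carries exactly the expected columns."""
    return set(json.loads(mqtt_payload).keys()) == EXPECTED_COLUMNS
```

A check like `matches_schema` could run in a pre-deployment test so a renamed MQTT field gets caught before the connector starts dropping or misrouting rows.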

The integration guide is here: [https://www.tigerdata.com/docs/integrate/data-ingestion-streaming/hivemq](https://www.tigerdata.com/docs/integrate/data-ingestion-streaming/hivemq)