
Amazon SageMaker AI is a fully managed machine learning (ML) service. With SageMaker AI, data scientists and developers can quickly and confidently build, train, and deploy ML models into a production-ready hosted environment.

This page shows you how to integrate Amazon SageMaker with a Tiger Cloud service.

To follow the steps on this page:

Create a table in your Tiger Cloud service to store model predictions generated by SageMaker.

  1. Connect to your Tiger Cloud service

    For Tiger Cloud, open an SQL editor in Tiger Cloud Console. For self-hosted TimescaleDB, use psql.

  2. For better performance and easier real-time analytics, create a hypertable

    Hypertables are Postgres tables that automatically partition your data by time. You interact with hypertables in the same way as regular Postgres tables, but with extra features that make managing your time-series data much easier.

    CREATE TABLE model_predictions (
        time TIMESTAMPTZ NOT NULL,
        model_name TEXT NOT NULL,
        prediction DOUBLE PRECISION NOT NULL
    ) WITH (
        tsdb.hypertable
    );

    When you create a hypertable using CREATE TABLE ... WITH ..., the default partitioning column is the first column with a timestamp data type. TimescaleDB also creates a columnstore policy that automatically converts your data to the columnstore after an interval equal to the chunk_interval; this interval is set through compress_after in the policy. The columnar format enables fast scanning and aggregation, optimizing performance for analytical workloads while also saving significant storage space. During the columnstore conversion, hypertable chunks are compressed by up to 98% and organized for efficient, large-scale queries.
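You can also name the partitioning column explicitly. As a sketch, assuming the tsdb.partition_column option supported by CREATE TABLE ... WITH in recent TimescaleDB versions:

```sql
-- Same table, with the partitioning column named explicitly
CREATE TABLE model_predictions (
    time TIMESTAMPTZ NOT NULL,
    model_name TEXT NOT NULL,
    prediction DOUBLE PRECISION NOT NULL
) WITH (
    tsdb.hypertable,
    tsdb.partition_column = 'time'
);
```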

    You can customize this policy later using alter_job. However, to change after or created_before, the compression settings, or the hypertable the policy is acting on, you must remove the columnstore policy and add a new one.
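For example, to change when chunks are converted, you remove the policy and add a new one. A sketch, assuming the remove_columnstore_policy and add_columnstore_policy procedures available in recent TimescaleDB versions; the one-day interval is illustrative:

```sql
-- Replace the default columnstore policy with one that converts
-- chunks once they are older than one day
CALL remove_columnstore_policy('model_predictions');
CALL add_columnstore_policy('model_predictions', after => INTERVAL '1 day');
```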

    You can also manually convert chunks in a hypertable to the columnstore.
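A sketch of a manual conversion, assuming the show_chunks function and the convert_to_columnstore procedure in recent TimescaleDB versions; the chunk name shown is illustrative and should be taken from the show_chunks output:

```sql
-- List chunks older than one day, then convert one to the columnstore
SELECT show_chunks('model_predictions', older_than => INTERVAL '1 day');
CALL convert_to_columnstore('_timescaledb_internal._hyper_1_2_chunk');
```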

Create a notebook in SageMaker and use it to write model predictions to your service.

  1. Create a SageMaker Notebook instance

    1. In Amazon SageMaker > Notebooks and Git repos, click Create Notebook instance.
    2. Follow the wizard to create a default Notebook instance.
  2. Write a Notebook script that inserts data into your Tiger Cloud service

    1. When your Notebook instance status is InService, click Open JupyterLab, then open a notebook with the conda_python3 kernel.

    2. Update the following script with your connection details, then paste it in the Notebook.

      import psycopg2
      from datetime import datetime, timezone

      def insert_prediction(model_name, prediction, host, port, user, password, dbname):
          # Connect to your Tiger Cloud service
          conn = psycopg2.connect(
              host=host,
              port=port,
              user=user,
              password=password,
              dbname=dbname,
          )
          cursor = conn.cursor()
          # Insert one prediction, timestamped with the current UTC time
          query = """
          INSERT INTO model_predictions (time, model_name, prediction)
          VALUES (%s, %s, %s);
          """
          values = (datetime.now(timezone.utc), model_name, prediction)
          cursor.execute(query, values)
          conn.commit()
          cursor.close()
          conn.close()

      # Example usage
      insert_prediction(
          model_name="example_model",
          prediction=0.95,
          host="<host>",
          port="<port>",
          user="<user>",
          password="<password>",
          dbname="<dbname>",
      )
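If your model emits many predictions at once, inserting them row by row is slow. A minimal sketch of batching them instead; build_prediction_rows and INSERT_SQL are hypothetical helpers, not part of the SageMaker or psycopg2 APIs:

```python
from datetime import datetime, timezone

# Template for a multi-row insert (psycopg2's execute_values fills in %s)
INSERT_SQL = """
INSERT INTO model_predictions (time, model_name, prediction)
VALUES %s;
"""

def build_prediction_rows(model_name, predictions, at=None):
    """Return (time, model_name, prediction) tuples ready for a batch insert."""
    at = at or datetime.now(timezone.utc)
    return [(at, model_name, float(p)) for p in predictions]

# With an open psycopg2 connection, the rows can be written in one round trip:
#   from psycopg2.extras import execute_values
#   execute_values(cursor, INSERT_SQL,
#                  build_prediction_rows("example_model", [0.95, 0.87]))

rows = build_prediction_rows("example_model", [0.95, 0.87])
print(len(rows))  # 2
```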
  3. Test your SageMaker script

    1. Run the script in your SageMaker notebook.

    2. Verify that the data is in your service

      Open an SQL editor and check the model_predictions table:

      SELECT * FROM model_predictions;

      You see something like:

      time                          | model_name            | prediction
      ------------------------------+-----------------------+-----------
      2025-02-06 16:56:34.370316+00 | timescale-cloud-model | 0.95
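Once predictions accumulate, the hypertable makes time-based rollups straightforward. A sketch using TimescaleDB's time_bucket function; the bucket width and time range are illustrative:

```sql
-- Average prediction per model in 15-minute buckets over the last day
SELECT time_bucket('15 minutes', time) AS bucket,
       model_name,
       avg(prediction) AS avg_prediction
FROM model_predictions
WHERE time > now() - INTERVAL '1 day'
GROUP BY bucket, model_name
ORDER BY bucket;
```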

Now you can seamlessly integrate Amazon SageMaker with Tiger Cloud to store and analyze time-series data generated by machine learning models. You can also integrate visualization tools such as Grafana or Tableau with Tiger Cloud to create real-time dashboards of your model predictions.
