Integrate Vertex AI with Astra DB Serverless

Estimated time: 15 minutes

Vertex AI can integrate with Astra DB Serverless through Vertex AI Extensions, which let you perform CRUD operations on your Serverless (Vector) databases using natural language.

The supported operations are:

  • readData

  • insertData

  • updateData

  • deleteData

Prerequisites

The steps in this guide assume the following:

  • You have an active Serverless (Vector) database with at least one collection in Astra DB.

  • You have a Google Cloud project with the Vertex AI, Cloud Run, Artifact Registry, and Secret Manager APIs enabled.

  • You have the Google Cloud CLI (gcloud) and Docker installed and configured locally.

Create a secret for your database credentials

  1. In Astra Portal, under Databases, navigate to your database.

  2. Ensure the database is in Active status, and then select Generate Token. In the Application Token dialog, click Copy to copy the token (e.g. AstraCS:WSnyFUhRxsrg…). Store the token in a secure location before closing the dialog.

    Your token is automatically assigned the Database Administrator role.

  3. Copy your database’s API endpoint, located under Database Details > API Endpoint (e.g. https://ASTRA_DB_ID-ASTRA_DB_REGION.apps.astra.datastax.com).

  4. In your Google Cloud project, create a secret for your database credentials.

    1. Name your secret DATASTAX_VERTEX_AI_TOKEN.

    2. Add your database credentials, separated by semicolons, to the secret value:

      TOKEN;API_ENDPOINT;COLLECTION

      Replace the following:

      • TOKEN: Your database application token.

      • API_ENDPOINT: Your database API endpoint.

      • COLLECTION: The name of the collection or table in your database.

      Example secret value
      AstraCS:QCyMhoAQlqBtHoNoOsGPXxeU:09315c939c7d6b2a0b1d586356952bb3a7d8da38e9714a855306f26ac1a7e178;https://a71b76c8-5236-4c1e-adb2-9fdd6c85d60b-us-east-1.apps.astra.datastax.com;my_collection
  5. Grant access to the secret.

    You can grant access to your own account if you plan to register your Astra DB extension manually. Otherwise, you can grant access to a service account.
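If you prefer the gcloud CLI to the Google Cloud console for the last two steps, the following sketch creates the secret and grants a service account access to it. The service account email is a placeholder, and the commands assume the Secret Manager API is enabled in your project.

    # Create the secret with the semicolon-separated value (replace the placeholders).
    echo -n "TOKEN;API_ENDPOINT;COLLECTION" | \
      gcloud secrets create DATASTAX_VERTEX_AI_TOKEN --data-file=-

    # Grant a service account read access to the secret.
    gcloud secrets add-iam-policy-binding DATASTAX_VERTEX_AI_TOKEN \
      --member="serviceAccount:SERVICE_ACCOUNT_EMAIL" \
      --role="roles/secretmanager.secretAccessor"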

Deploy a container for your Astra DB extension

Deploy a Docker container to Cloud Run for your Astra DB extension.

  1. Authenticate your Google Cloud account in the Google Cloud CLI.

    gcloud auth login
  2. Set your project ID.

    gcloud config set project PROJECT_ID

    Replace PROJECT_ID with your Google Cloud project ID.

  3. Create a repository for the container.

    gcloud artifacts repositories create astra-api \
    --repository-format=docker \
    --location=us-central1 \
    --description="Vertex AI Extension Container for Astra DB" \
    --async
  4. Build the container image.

    docker build --platform linux/amd64 -t us-central1-docker.pkg.dev/PROJECT_ID/astra-api/astra-crud astra-crud-extension

    Replace PROJECT_ID with your Google Cloud project ID.

  5. Push the image.

    docker push us-central1-docker.pkg.dev/PROJECT_ID/astra-api/astra-crud

    Replace PROJECT_ID with your Google Cloud project ID.

  6. Register the container artifact by deploying a service in Cloud Run. (An example deploy command follows this procedure.)

    Once deployed, you receive a Cloud Run service URL. Note this URL; you need it for the extension.yaml file in the next step.

  7. Download the extension.yaml file from GitHub or copy the contents from the example below.

    extension.yaml
    openapi: 3.1.0
    info:
      title: Astra Vertex Extension
      description: An extension to perform CRUD actions on data within your Astra Database.
      version: 1.0.0
    servers:
      - url: '[YOUR_CLOUD_RUN_SERVICE_URL]'
    paths:
      /health:
        get:
          operationId: health
          summary: A simple health check endpoint
          responses:
            '200':
              description: Successful response
              content:
                application/json:
                  schema:
                    type: object
                    properties:
                      message:
                        type: string
      /readData:
        post:
          operationId: readData
          summary: Search for data within the database using a filter
          requestBody:
            required: true
            content:
              application/json:
                schema:
                  type: object
                  properties:
                    filter:
                      type: object
                      description: Key-value pairs to filter the data
          responses:
            '200':
              description: Successful response
              content:
                application/json:
                  schema:
                    type: array
                    items:
                      type: object
      /updateData:
        post:
          operationId: updateData
          summary: Update existing data within the database
          requestBody:
            required: true
            content:
              application/json:
                schema:
                  type: object
                  properties:
                    filter:
                      type: object
                    fieldUpdate:
                      type: object
          responses:
            '200':
              description: Data updated successfully
              content:
                application/json:
                  schema:
                    type: array
                    items:
                      type: object
      /insertData:
        post:
          operationId: insertData
          summary: Insert new data into the database
          requestBody:
            required: true
            content:
              application/json:
                schema:
                  type: object
                  properties:
                    data:
                      type: array
                      items:
                        type: object
          responses:
            '200':
              description: Data inserted successfully
              content:
                application/json:
                  schema:
                    type: array
                    items:
                      type: object
      /deleteData:
        post:
          operationId: deleteData
          summary: Delete existing data within the database
          requestBody:
            required: true
            content:
              application/json:
                schema:
                  type: object
                  properties:
                    filter:
                      type: object
          responses:
            '200':
              description: Data deleted successfully
              content:
                application/json:
                  schema:
                    type: array
                    items:
                      type: object

    Open the file in your preferred editor and replace [YOUR_CLOUD_RUN_SERVICE_URL] with the Cloud Run service URL from the previous step. Save and close the file.
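As a sketch of step 6, the following command deploys the pushed image to Cloud Run and prints the service URL on success. The service name and the --allow-unauthenticated flag are illustrative assumptions, so adjust them to your environment; if docker push in step 5 fails with an authentication error, run gcloud auth configure-docker us-central1-docker.pkg.dev first.

    # Deploy the container image as a Cloud Run service (service name and flags
    # are illustrative). The command prints the service URL to use in extension.yaml.
    gcloud run deploy astra-crud \
      --image=us-central1-docker.pkg.dev/PROJECT_ID/astra-api/astra-crud \
      --region=us-central1 \
      --allow-unauthenticated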

Create your Astra DB extension in Vertex AI

You can use the Google Cloud Console or the Vertex AI Python SDK to create your Astra DB extension in Vertex AI.

Google Cloud Console

  1. Navigate to your project’s Vertex AI Extensions page and click Create extension.

  2. In the Create a new extension dialog, configure the following fields:

    • Extension name: Enter a name for the extension. For example, astra_tool.

    • Description: Enter an optional description for the extension. For example, Astra DB CRUD operations.

    • API name: Enter a name for the API shown to the LLM. For example, astra_tool.

    • API description: Enter a description of the API shown to the LLM. For example, Astra DB CRUD Operations.

    • Source: Click Upload.

    • Upload OpenAPI Spec file: Click Browse and then select your extension.yaml file.

    • Authentication: Select API key.

    • Parameter name: Enter token.

    • API key secret: Enter the path to your DATASTAX_VERTEX_AI_TOKEN secret in the following format:

      projects/PROJECT_ID/secrets/DATASTAX_VERTEX_AI_TOKEN/versions/latest

      Replace PROJECT_ID with your Google Cloud project ID.

    • Http element location: Select Header.

  3. Click Create.

    Your Astra DB extension is now installed.

Python SDK

  1. Download and install the Vertex AI Python SDK.

    pip install --upgrade google-cloud-aiplatform
  2. Create a storage bucket.

    gcloud storage buckets create gs://BUCKET_NAME --project=PROJECT_ID --location=BUCKET_LOCATION

    Replace the following:

    • BUCKET_NAME: The name you want to give your bucket, subject to naming requirements. For example, my-bucket.

    • PROJECT_ID: Your Google Cloud project ID.

    • BUCKET_LOCATION: The location of your bucket. For example, us-east1.

  3. Copy the extension.yaml file to the storage bucket into the folder of your choice. For example:

    gsutil cp extension.yaml gs://BUCKET_NAME/EXTENSION_PATH/extension.yaml

    Replace the following:

    • BUCKET_NAME: The name of the storage bucket you created.

    • EXTENSION_PATH: The folder where you want to store the extension.yaml file.

  4. Register your Astra DB extension with the Python SDK.

    from google.cloud.aiplatform.private_preview import llm_extension
    
    PROJECT_ID = "PROJECT_ID"
    SECRET_ID = "DATASTAX_VERTEX_AI_TOKEN"
    BUCKET_NAME = "BUCKET_NAME"
    EXTENSION_PATH = "EXTENSION_PATH"
    
    extension_astra = llm_extension.Extension.create(
        display_name = "Perform a CRUD Operation on Astra DB",
        description = "Inserts, loads, updates, or deletes data from Astra DB and returns it to the user",
        manifest = {
            "name": "astra_tool",
            "description": "Access and process data from AstraDB",
            "api_spec": {
                "open_api_gcs_uri": f"gs://{BUCKET_NAME}/{EXTENSION_PATH}/extension.yaml"
            },
            "authConfig": {
                "authType": "API_KEY_AUTH",
                "apiKeyConfig": {
                    "name": "token",
                    "apiKeySecret": f"projects/{PROJECT_ID}/secrets/{SECRET_ID}/versions/1",
                    "httpElementLocation": "HTTP_IN_HEADER",
                },
            }
        },
    )

    Replace the following:

    • PROJECT_ID: Your Google Cloud project ID.

    • BUCKET_NAME: The name of your storage bucket.

    • EXTENSION_PATH: The folder where you stored the extension.yaml file.

  5. Navigate to the Vertex AI Extensions page and confirm your Astra DB extension was successfully created.
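Optionally, you can also confirm the registration from your script by printing the returned object. This is a minimal check; the exact fields shown depend on the SDK version, but the output typically includes the extension's resource name, which contains the EXTENSION_ID used in the next section.

    # Print the registered extension. The representation generally includes a
    # resource name such as
    # projects/PROJECT_ID/locations/us-central1/extensions/EXTENSION_ID.
    print(extension_astra)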

Test the integration

You can use the Google Cloud Console or the Vertex AI Python SDK to verify that the integration is working.

Google Cloud Console

  1. On the Vertex AI Extensions page, select your newly-created Astra DB extension.

  2. Ask your extension: "What information do you have from the db?"

    If the response is a list of the data in your collection, then the integration was successful.

Python SDK

  1. On the Vertex AI Extensions page, locate the ID of your newly-created Astra DB extension.

  2. Run the following Python code to test your Astra DB extension:

    from google.cloud.aiplatform.private_preview import llm_extension

    extension_astra = llm_extension.Extension('projects/PROJECT_ID/locations/us-central1/extensions/EXTENSION_ID')

    # Call the health endpoint to confirm the extension can reach the Cloud Run service.
    extension_astra.execute("health", operation_params={})

    # Read data from the collection (no filter).
    extension_astra.execute("readData", operation_params={})

    Replace the following:

    • PROJECT_ID: Your Google Cloud project ID.

    • EXTENSION_ID: The ID of your Astra DB extension.

    If the response is a list of the data in your collection, then the integration was successful.
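The operation_params argument maps to the request body defined in extension.yaml for each operation. For example, assuming your collection stores documents with name and value fields (illustrative field names, not part of this guide), you could exercise the other operations like this:

    # Insert a document, read it back with a filter, then delete it.
    # The field names are illustrative; use fields that exist in your collection.
    extension_astra.execute(
        "insertData",
        operation_params={"data": [{"name": "example-doc", "value": 42}]},
    )
    extension_astra.execute(
        "readData",
        operation_params={"filter": {"name": "example-doc"}},
    )
    extension_astra.execute(
        "deleteData",
        operation_params={"filter": {"name": "example-doc"}},
    )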

Remove the integration

To remove the Astra DB integration:

  1. Locate your Astra DB extension on your project’s Vertex AI Extensions page.

  2. Click More (the vertical three-dot icon), and then click Delete.

    Your Astra DB extension is deleted.

  3. Locate and delete the DATASTAX_VERTEX_AI_TOKEN secret and any associated permissions.
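For example, you can delete the secret with the gcloud CLI. Removing the Cloud Run service and the Artifact Registry repository created earlier is optional cleanup; the service name below is only illustrative, so substitute whatever name you chose when deploying to Cloud Run.

    # Delete the secret that holds the Astra DB credentials.
    gcloud secrets delete DATASTAX_VERTEX_AI_TOKEN

    # Optional cleanup: remove the Cloud Run service and the container repository.
    gcloud run services delete astra-crud --region=us-central1
    gcloud artifacts repositories delete astra-api --location=us-central1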

The Astra DB integration is removed and Vertex AI can no longer access your Astra DB database.
