Configure Generative AI (Default) Connection

    Article Summary

    This article explains how to set up your Test Modeller instance to use the Generative AI feature within the Curiosity Platform. We will cover the specifics of Generative AI and the Large Language Models (LLMs) that we work with for our cloud and on-premise deployments.

    Default Model

    Cloud Deployment

    In the cloud version of Test Modeller, Generative AI is pre-installed and automatically activated in your workspace. Should you wish to disable this feature, please contact your Curiosity account representative.

    The cloud-based Test Modeller utilizes OpenAI's GPT-3.5. We advise you to review the OpenAI privacy policy prior to leveraging any of the Generative AI capabilities with Test Modeller.

    On-Premise Deployment

    For on-premise deployments, the Generative AI capability is disabled by default. To use this functionality, it needs to be enabled via the environment variables in the docker-compose configuration for the installation.

    Out of the box, Test Modeller supports Large Language Models from OpenAI and Azure. However, its flexible architecture means it can easily integrate with any homegrown LLM, or with a specific LLM an organization is already using within its infrastructure, depending on the organization's security policy.

    Requirements for Configuring Generative AI

    • Either an OpenAI or Azure OpenAI account.
    • An API Key for the relevant models.
    • A chat completions model (e.g. gpt-3.5-turbo-16k-0613).
    • A completion model (e.g. text-davinci-003).

    To configure the service, the following environment variables need to be set. This can be done by creating a docker-compose-custom.yml file to hold the custom configuration. If you already have a docker-compose-custom.yml file with other custom configuration elements, simply edit that file instead:
    cp docker-compose-basic.yml docker-compose-custom.yml

    Inside the docker-compose-custom.yml file, the api service's environment section needs to contain the environment variables listed below. If the api service does not exist in the YAML file, add it first; the docker-compose-ad.yml file can be used as an example.
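
    As an illustrative sketch only, the environment section might look like the fragment below. The key and model values are placeholders (assumptions, not real credentials); keep whatever other settings your api service already defines and adjust the values to your installation.

    ```yaml
    # docker-compose-custom.yml (fragment) - illustrative sketch only.
    # Replace the placeholder keys and adjust models/endpoints as needed.
    services:
      api:
        environment:
          - OPEN_AI_SOURCE=OpenAI
          - OPEN_AI_URL=https://api.openai.com/v1/chat/completions
          - OPEN_AI_COMPLETIONS_URL=https://api.openai.com/v1/completions
          - OPEN_AI_KEY=sk-your-key-here
          - OPEN_COMPLETION_AI_KEY=sk-your-key-here
          - OPEN_AI_CHAT_MODEL=gpt-3.5-turbo
          - OPEN_AI_COMPLETION_MODEL=text-davinci-003
          - GENERATIVE_AI_ENABLED=true
    ```

    The stack can then be (re)started against this file, for example with docker-compose -f docker-compose-custom.yml up -d (the exact command depends on how your installation is launched).
    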

    Environment variables (name, default value, description):

    • OPEN_AI_SOURCE (default: OpenAI): OpenAI to use OpenAI, Azure to use the Azure service.
    • OPEN_AI_PROXY: Proxy URL, if one is required to access the endpoints.
    • OPEN_AI_URL (default: https://api.openai.com/v1/chat/completions): Chat completions API endpoint.
    • OPEN_AI_COMPLETIONS_URL (default: https://api.openai.com/v1/completions): Completions API endpoint.
    • OPEN_AI_KEY: API key for the chat completions model.
    • OPEN_COMPLETION_AI_KEY: API key for the completions model.
    • OPEN_AI_CHAT_MODEL (default: gpt-3.5-turbo): Chat model.
    • OPEN_AI_COMPLETION_MODEL (default: text-davinci-003): Completions model.
    • GENERATIVE_AI_ENABLED (default: false): Whether Generative AI is enabled (true / false) for the instance.
    • GENERATIVE_AI_ENABLED_EMAILS: A comma-separated list of emails that should have access to the Generative AI feature. If empty, all users have access.
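
    To make the GENERATIVE_AI_ENABLED_EMAILS rule concrete, the documented behaviour (comma-separated allow-list; an empty value grants everyone access) can be sketched as below. This is an illustration only, not Test Modeller's actual implementation; the function name is an assumption.

    ```python
    # Sketch of the access rule described for GENERATIVE_AI_ENABLED_EMAILS.
    # Hypothetical helper for illustration; not Test Modeller's real code.

    def has_generative_ai_access(user_email: str, enabled_emails_var: str) -> bool:
        """Return True if the user may use the Generative AI feature.

        enabled_emails_var mirrors the GENERATIVE_AI_ENABLED_EMAILS value:
        a comma-separated list of emails; an empty value means all users
        have access.
        """
        allowed = [e.strip().lower() for e in enabled_emails_var.split(",") if e.strip()]
        if not allowed:  # empty allow-list: feature is open to everyone
            return True
        return user_email.strip().lower() in allowed

    print(has_generative_ai_access("qa@example.com", "qa@example.com, dev@example.com"))   # True
    print(has_generative_ai_access("other@example.com", "qa@example.com, dev@example.com"))  # False
    print(has_generative_ai_access("anyone@example.com", ""))  # True (empty list)
    ```

    Note that because the variable is a plain comma-separated string, stray whitespace around addresses is harmless in this sketch, but the list itself should not contain empty entries carrying meaning.
    
    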