Now that we know our goal (a chatbot powered by LLMs), the first thing we need is an environment to run our AI models. We'll use Google Cloud Platform (GCP) and its Vertex AI service.
In the Google Cloud console, create a new project (we'll call it chatbot-project) and enable the Vertex AI API for it. Our app also needs permission to use Vertex AI. For this, we use a Service Account.
Create a Service Account (we'll call it vertex-chatbot) and generate a key for it. A .json file will be downloaded; keep this safe! (we'll use it in PyCharm). Save it somewhere easy to find on your machine (e.g. ~/chatbot/key.json). We now have Google Cloud + Vertex AI ready. In the next step, we'll connect our local Streamlit app to Vertex AI using the API key / JSON credentials and make our first chatbot request.
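As a preview of that next step, here is a minimal sketch of a Streamlit app that authenticates with the downloaded key and makes a first request. It assumes the key was saved to ~/chatbot/key.json and the project is called chatbot-project; the region (us-central1) and model name (gemini-1.5-flash) are placeholders, so substitute whatever is available in your project.

```python
import os

import streamlit as st
import vertexai
from vertexai.generative_models import GenerativeModel

# Point the Google client libraries at the service-account key we downloaded.
# (Path and project name are assumptions from the setup above.)
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = os.path.expanduser("~/chatbot/key.json")

# Initialise Vertex AI with our project and a region that offers the model we want.
vertexai.init(project="chatbot-project", location="us-central1")

st.title("Vertex AI chatbot")

prompt = st.text_input("Ask me anything:")
if prompt:
    model = GenerativeModel("gemini-1.5-flash")  # placeholder model name
    response = model.generate_content(prompt)
    st.write(response.text)
```

Setting GOOGLE_APPLICATION_CREDENTIALS is the standard way Google's client libraries discover credentials, so the rest of the code doesn't need to handle the key file directly. Save the sketch as app.py and run it with `streamlit run app.py` to try it out.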