HashiCorp AI - Part 1

Coming soon: intelligence baked into the HashiCorp developer portal.

This is an intentionally short and scattered, late-project-stage retrospective note dump on building and delivering an AI-powered feature, fully end-to-end. I'll occasionally come back to make revisions here.

9/5

Background

  • ChatGPT was released and it was mind-blowing.
  • Supabase rolled out Ask Supabase AI.
  • Other companies followed suit.
  • There was an explosion of AI around May 2023.
  • I wrote an initial opinion document. Leadership gave the greenlight. The project kicked off.

The feature

This is a look at the feature.

The fun parts

  • The experimental nature: no claims to accuracy, and no dire need to get it right from the get-go
  • Greenfield backend, from scratch end-to-end
    • AWS AppRunner’s auto-deploy on container image update is a huge win for rapid backend iteration
    • Kysely's type-safe SQL queries via TypeScript
  • Data modeling: reverse engineering ChatGPT's conversations
  • Technical challenges
    • Rate-limiting
    • Authentication and authorization via an external team's identity service.
  • Getting the feature in front of users — https://dev-portal-git-kwai-hashicorp.vercel.app/
  • Balancing pragmatism with excitement around the newness of AI
  • Keeping things simple
  • Joining minds with cross-org initiatives around AI like the Terraform team
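On the rate-limiting challenge: a common starting point is a per-user token bucket. This is a minimal in-memory sketch of the idea, not the service's actual implementation (the real limits and storage are assumptions left out here):

```typescript
// Minimal in-memory token-bucket rate limiter (illustrative only).
// A production version would live in shared storage (e.g. Redis),
// keyed per user.
class TokenBucket {
  private tokens: number;
  private lastRefill: number;

  constructor(
    private capacity: number,        // max burst size
    private refillPerSecond: number, // sustained request rate
    now: number = Date.now(),
  ) {
    this.tokens = capacity;
    this.lastRefill = now;
  }

  // Returns true if the request is allowed, false if rate-limited.
  tryConsume(now: number = Date.now()): boolean {
    const elapsedSeconds = (now - this.lastRefill) / 1000;
    this.tokens = Math.min(
      this.capacity,
      this.tokens + elapsedSeconds * this.refillPerSecond,
    );
    this.lastRefill = now;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}

// Hypothetical limits: 5-request burst, 1 request/second sustained.
// Six requests arrive 10ms apart; the burst absorbs the first five.
const bucket = new TokenBucket(5, 1, 0);
const results = [0, 0, 0, 0, 0, 0].map((_, i) => bucket.tryConsume(i * 10));
// results → [true, true, true, true, true, false]
```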

The hard parts

  • Estimating project timeline
  • Estimating costs
  • Keeping costs at bay
  • Learning new patterns and tech
    • This is generally fun, but requires stepping out of immediate comfort zone
      • Server Sent Events were new to me
      • Working firsthand with streaming was generally new to me
    • Prompt engineering and not being sure if certain adjustments would improve response quality
  • Wearing all the hats — engineering, a tiny bit of design, product, project management
  • The newness of AI in the industry — the lack of established patterns and practices makes success metrics not straightforward to justify
  • Keeping technical complexity low. This includes knowing when to omit or introduce new tech.
    • ✅ OpenAI V4 Node SDK
    • ✅ PineconeDB SDK
    • 🚫 Vercel AI SDK
    • 🚫 Langchain

The sloggy parts

It is a meaningful skill to be able to navigate these waters.
  • Large organization bureaucracy and politics
    This should not be misconstrued as bad-mouthing or as a negative thing. It is a reality of the state of large organizations.
  • Getting procurement approval — <Insert Futurama-Fry "take my money" meme>
  • Getting legal approval
    • The newness of AI in the industry leaves legal matters in a grey area
  • Getting security approval
  • Getting compliance approval; onboarding third-party services
    • PineconeDB
    • OpenAI
    • Azure OpenAI service

Learnings

  • Do work in the open (at least internally)
  • AI is not magic.
  • AI tools are rapidly becoming integrated into daily life.
    • Expectations will shift very quickly.
  • Azure OpenAI is mostly a drop-in replacement for OpenAI.
    • Easier to procure and get approval for, if your org already has Azure usage
  • Use OpenAI to convert text to vectors
  • Store vectors in a vector database; use the vector database for similarity search
  • OpenAI Docs are a good standard
  • Prompt engineering tactics — document citation.
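The embed-and-search flow above ultimately reduces to comparing vectors. Pinecone does this at scale, but the core ranking operation is just cosine similarity. A toy sketch with made-up 3-d vectors (real OpenAI embeddings are 1536-dimensional, and the document IDs here are hypothetical):

```typescript
// Cosine similarity: the metric a vector database typically uses to
// rank stored document embeddings against a query embedding.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Toy 3-d "embeddings" standing in for real 1536-d OpenAI vectors.
const query = [0.9, 0.1, 0.0];
const docs: Record<string, number[]> = {
  "vault-docs": [0.8, 0.2, 0.1],
  "terraform-docs": [0.1, 0.1, 0.95],
};

// Rank documents by similarity to the query, highest first — the
// essence of "use the vector database for similarity search".
const ranked = Object.entries(docs)
  .map(([id, vec]) => [id, cosineSimilarity(query, vec)] as const)
  .sort((x, y) => y[1] - x[1]);
// ranked[0][0] → "vault-docs"
```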

Todos

  • Fine tuning
  • Tactics for reducing token usage when conversations reach longer lengths
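One candidate tactic for that second todo: drop the oldest turns until the conversation fits a token budget, always keeping the system prompt. This sketch uses a rough 4-characters-per-token estimate; a real implementation would use an actual tokenizer (e.g. tiktoken), and the budget numbers are made up:

```typescript
interface Message {
  role: "system" | "user" | "assistant";
  content: string;
}

// Rough heuristic: ~4 characters per token for English text.
// Swap in a real tokenizer for accurate counts.
const estimateTokens = (text: string): number => Math.ceil(text.length / 4);

// Keep the system prompt plus as many of the most recent turns as fit.
function truncateHistory(messages: Message[], maxTokens: number): Message[] {
  const [system, ...turns] = messages;
  let budget = maxTokens - estimateTokens(system.content);
  const kept: Message[] = [];
  for (let i = turns.length - 1; i >= 0; i--) {
    const cost = estimateTokens(turns[i].content);
    if (cost > budget) break;
    budget -= cost;
    kept.unshift(turns[i]); // preserve chronological order
  }
  return [system, ...kept];
}

// Hypothetical history with known sizes (2, 10, 4, and 3 tokens).
const history: Message[] = [
  { role: "system", content: "s".repeat(8) },
  { role: "user", content: "a".repeat(40) },
  { role: "assistant", content: "b".repeat(16) },
  { role: "user", content: "c".repeat(12) },
];
const trimmed = truncateHistory(history, 10);
// The oldest (largest) user turn is dropped; the system prompt and the
// two most recent turns fit the 10-token budget.
```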

Want early access?

Reach out and I can get you added to the early access list.
