Building a static directory site with AI tools

(poolnear.me)

2 points | by swlevy 11 hours ago

1 comment

  • swlevy 11 hours ago

    Had a little time the past month, so decided to put together a directory site for swimming pools (yeah I know the season is over, but with my kids it’s something I’ve wanted each spring/summer): https://poolnear.me .

    …it’s been a while since I’ve built a personal-class website, so instead of just hacking it together for the Bay Area, I figured I’d use it as a testbed to see how quickly one could build something like this for the whole country by leveraging AI tools.

    Gemini via API (summarizing reviews, figuring out pool temperatures, etc.)
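    To make the review-summarization step concrete, here’s a minimal sketch of building a Gemini generateContent request body. The prompt wording, pool name, and helper name are illustrative assumptions; only the contents/parts payload shape matches the Gemini REST API. You’d POST this JSON to the model’s generateContent endpoint with your API key.

    ```python
    import json

    def build_review_summary_request(pool_name, reviews):
        """Build a Gemini generateContent payload asking for a review summary.

        Prompt text is a made-up example; the {"contents": [{"parts": ...}]}
        structure is what the Gemini REST API expects.
        """
        prompt = (
            f"Summarize these visitor reviews of {pool_name} in 2-3 sentences, "
            "and note the water temperature if anyone mentions it:\n\n"
            + "\n".join(f"- {r}" for r in reviews)
        )
        return {"contents": [{"parts": [{"text": prompt}]}]}

    payload = build_review_summary_request(
        "Rinconada Pool",  # hypothetical pool name
        ["Great lap lanes, water felt around 80F.", "Crowded on weekends."],
    )
    print(json.dumps(payload)[:60])
    ```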

    Code assistance via Copilot (Sonnet 4.5); I realize better options are out there, but since this is a side project I also leveraged a “Codespace” on GitHub with one-click web-based VS Code. This meant I could work on the project for a few minutes here and there on virtually any computer (even a tablet), which was super cool.

    The trickiest part proved to be the geospatial logic: breaking the country down in a logical way without incurring ridiculous API costs for proprietary data. Sonnet did an impressively good job helping me leverage Uber’s H3 library to stitch together government CBSA data (metro-area boundaries) with Google Places API calls.
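    The key property of this approach is that the cell count bounds your metered API calls per metro. The real build used H3 hexagons polyfilled over CBSA polygons; here’s a stdlib-only stand-in using a square lat/lng grid over a bounding box (all coordinates and the cell size are illustrative assumptions) that shows the same idea:

    ```python
    def grid_cells(bbox, cell_deg=0.25):
        """Cover a metro bounding box with square cells.

        Stand-in for H3 polyfill over a CBSA polygon: each cell center is
        one candidate Places Nearby Search call, so len(cells) is an upper
        bound on API calls for that metro. cell_deg is an assumption.
        """
        (south, west), (north, east) = bbox
        cells = []
        lat = south
        while lat < north:
            lng = west
            while lng < east:
                cells.append((round(lat + cell_deg / 2, 4),
                              round(lng + cell_deg / 2, 4)))
                lng += cell_deg
            lat += cell_deg
        return cells

    # Rough SF Bay Area bounding box (illustrative coordinates).
    bay_area = ((36.9, -123.0), (38.3, -121.5))
    cells = grid_cells(bay_area)
    print(len(cells), "cells -> at most", len(cells), "Nearby Search calls")
    ```

    With real H3 you’d get hexagons that tile the actual metro polygon instead of its bounding box, which avoids wasting calls on cells that fall outside the CBSA boundary.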

    OpenAI / ChatGPT for static image generation. I didn’t even bother wiring this up via the API since my needs were limited (some icons / chips for top-level locales / etc.)

    GitHub Actions deploys the site - completely static - to Amazon S3 after it’s generated inside the Codespaces / VS Code IDE. Even autocomplete is handled by sharding search values into a manageable set of files the frontend can request directly; this limits functionality, but zero ongoing maintenance/monitoring is required unless I actually want to make updates. Bonus: if you put Amazon’s CloudFront cache in front of S3, you get free IP-based geolocation passed in the headers.
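    The sharded-autocomplete trick can be sketched in a few lines. The shard scheme here (first two lowercased letters of the name) and the file layout are assumptions for illustration; the point is that the generator emits small static JSON files the frontend can fetch straight from S3 as /autocomplete/<prefix>.json, with no server involved.

    ```python
    import collections
    import json
    import pathlib
    import tempfile

    def build_autocomplete_shards(names, out_dir, prefix_len=2):
        """Shard place names into static JSON files keyed by query prefix.

        Writes one <prefix>.json per shard; the frontend maps the user's
        typed prefix to the same key and fetches that file directly.
        """
        shards = collections.defaultdict(list)
        for name in names:
            key = name.lower().replace(" ", "")[:prefix_len]
            shards[key].append(name)
        out = pathlib.Path(out_dir)
        out.mkdir(parents=True, exist_ok=True)
        for key, entries in shards.items():
            (out / f"{key}.json").write_text(json.dumps(sorted(entries)))
        return dict(shards)

    with tempfile.TemporaryDirectory() as tmp:
        # Hypothetical pool names, just to show the sharding behavior.
        shards = build_autocomplete_shards(
            ["Rinconada Pool", "Richmond Plunge", "Burgess Pool"], tmp)
        print(sorted(shards))
    ```

    Two-letter prefixes keep any single shard small while capping the total file count, which is what makes the zero-maintenance static hosting work.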

    Conclusions: all of the component parts of a site like this are now pretty quick and easy to put together with readily available tools (a dramatic shift from even five years ago). I’d still say someone who hasn’t done it manually before would struggle with “gluing the different parts together” - for example, to get the deployment working you have to copy some secrets from AWS into GitHub - though some of the vibe-coding platforms can do all this under one roof for a few extra bucks a month. You can also get into trouble with API costs, as I mentioned; it’s important to understand whether your AI-generated approach is going to trigger 10 / 100 / 1000 calls to metered services.
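    A back-of-envelope estimate before running a crawl catches the 10-vs-1000-calls problem early. Every number below is made up (metro count, cells per metro, pages per cell, and the per-1000-calls price - check current Google pricing); the sketch just shows the multiplication you should do before letting generated code loose on a metered API:

    ```python
    def estimate_places_cost(metros, cells_per_metro, pages_per_cell,
                             price_per_1000=32.0):
        """Rough metered-API cost estimate for a nationwide crawl.

        price_per_1000 is a placeholder, not Google's actual rate;
        all inputs are assumptions to be replaced with your own.
        """
        calls = metros * cells_per_metro * pages_per_cell
        return calls, calls / 1000 * price_per_1000

    calls, dollars = estimate_places_cost(metros=400, cells_per_metro=30,
                                          pages_per_cell=2)
    print(calls, "calls, ~$%.2f" % dollars)
    ```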