path: root/Biz/Storybook.py
18 hours ago  Update ollama, llm-ollama, openai-python, llm  (Ben Sima)
I couldn't use llm-ollama because it required some package upgrades, so I went down that rabbit hole and ended up 1) realizing that these packages are way out of date now, and 2) fiddling with overrides to get everything to work. I finally figured it out; the `postPatch` in ollama-python was throwing me off for about half a day. One thing to note is that these packages are changing fast, and I need to either move to nixpkgs unstable for the Python stuff or maintain my own builds of all of them. Not sure which is more appropriate right now. Oh, and I had to fix up some logging code in Biz/Storybook.py because ruff started complaining about it, which is odd because I don't think the ruff version changed, but it was easy enough to change.
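The commit doesn't name the rule ruff raised; a common one for logging code is G004 (f-strings inside logging calls), and the fix usually looks like this sketch, where the logger, function, and message are invented for illustration:

```python
import logging

log = logging.getLogger(__name__)


def save_page(page_id: int) -> None:
    # Before (flagged by ruff's G004): log.info(f"saving page {page_id}")
    # After: defer formatting to the logging framework.
    log.info("saving page %s", page_id)
```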
2025-01-21  Move test runner to Omni/Test.py  (Ben Sima)
Like the previous commit, this matches Omni/Test.hs.
2025-01-21  Move Area to Omni/App.py  (Ben Sima)
This matches Omni/App.hs, and I'll use it in future projects.
2025-01-06  Remove Python main idiom and add coding conventions to README.md  (Ben Sima)
I realized I don't need this stupid `__main__` convention anymore because my build system always calls Python programs like `python -m main`, so I just need to have a function named `main()`. I also started adding some general coding conventions to the README and fixed a typo.
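For illustration, the convention change looks roughly like this; the module and body are hypothetical, and the assumption is that the build system's runner imports the module and calls `main()` itself:

```python
# Hypothetical Biz/Example.py


def main() -> None:
    # The build wrapper is assumed to import this module and call main(),
    # so no entry-point guard is required.
    print("hello")


# Boilerplate removed by this convention:
# if __name__ == "__main__":
#     main()
```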
2024-12-21  Add shebangs and x bit to executables  (Ben Sima)
With run.sh, we can build and run a file in one go. This means we can also use it as an interpreter on a shebang line and properly use the Unix executable bit. This is pretty cool and gives a few advantages: running any executable file is just `exec file.hs` or even `./file.hs`; finding all executables is `fd -t x`; you don't need to specify or know an `out` name to run something; and execution of a program is standardized.

There is a hack to get this to work. In C and Common Lisp, `#!` is illegal syntax, so I had to use shell syntax to invoke run.sh, call it on the current file, and then exit the shell script. Meanwhile, run.sh takes the file and evals the whole thing, building and running it. As long as either `//` or `;` is a comment character in the target language, this works.

Maybe a better approach would be to pre-process the file and strip the `#!` before passing it to the C compiler, like [ryanmjacobs/c][1] and [tcc][2]? However, that won't work in Lisp, because then I can't load the file directly into the repl, so maybe the comment hack needs to stay.

[1]: https://github.com/ryanmjacobs/c/tree/master
[2]: https://repo.or.cz/tinycc.git/blob/HEAD:/tccrun.c
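As a sketch of the simple case: in a language where `#` already starts a comment, the shebang can point straight at run.sh (the path and file below are assumptions, not the repo's actual layout), while C and Common Lisp need the `//` or `;` comment trick described above to smuggle in the shell invocation:

```python
#!/usr/bin/env run.sh
# Hypothetical executable Python file; after `chmod +x`, running
# `./Example.py` hands this file to run.sh, which builds and runs it.


def main() -> None:
    print("built and run via run.sh")
```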
2024-12-21  Build and deploy storybook  (Ben Sima)
I put the storybook into a new Biz.nix deploy target. The idea here is that any Biz/* targets should be hosted by this one VM for simplicity. Over time I can grow this as need be, but this should work to host a few services.
2024-12-21  Async end-to-end Storybook working  (Ben Sima)
I deleted the tests because they were overspecifying the functionality. My mistake was to try to build out the objects and endpoints before the end-to-end sync flow was fully working. Then I misunderstood how to do async with HTMX: I was overcomplicating it by trying to create objects and endpoints for everything instead of just focusing on the HTML I should be generating. This all led to a clusterfuck of code doing the wrong things in the wrong places. So far, this is much better architected. It also turns out that using image n-1 with OpenAI's create_variation function doesn't work very well anyway, so I scrapped that too; I'll have to look into different image generation services in the future.
2024-12-21  Add some mock tests of the Image endpoint  (Ben Sima)
These were contributed in part by gptme, thanks!
2024-12-21  Manage Storybook Images  (Ben Sima)
This adds the Images endpoint and related functions for loading and saving images to the filesystem. In the view layer, it also loads the images asynchronously using HTMX, so the images get lazy-loaded only when they are done generating.
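The commit doesn't show the markup, but the usual HTMX lazy-load pattern is a placeholder element that re-fetches itself until the image exists. A rough sketch in Ludic-flavored Python; the element helpers, route, and attribute spellings are assumptions and may not match the real code:

```python
from ludic.html import div, img  # assumed Ludic element helpers


def image_placeholder(image_id: str) -> div:
    # Placeholder rendered while the image is still generating; htmx
    # re-requests it shortly after load and swaps in whatever comes back.
    return div(
        "Generating...",
        hx_get=f"/images/{image_id}",
        hx_trigger="load delay:2s",
        hx_swap="outerHTML",
    )


def image_ready(image_id: str) -> img:
    # Returned once the image file exists on disk.
    return img(src=f"/static/images/{image_id}.png")
```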
2024-12-21  Convert Biz/Storybook.py to Ludic  (Ben Sima)
This is basically a full rewrite. I ripped out Flask and rearchitected the whole thing around fully RESTful resources and endpoints built with Ludic. The UI was completely redone using Ludic's components. I added tests for everything that I reasonably could. This is almost ready for an alpha launch. Before shipping it I still need to:

1. generate images using image n-1 applied to `openai.images.create_variation()`
2. write a nix service and get it on a VM somewhere; I'll probably provision a new VM for this
3. replace the `db` thing with a real sqlite database

I only need the first one done to show it to Lia and see if she likes it, and that should be completed in a day or two. Then the nix service and deployment won't take long at all. Setting up a sqlite database will be annoying, but I can't see that actually taking more than 2 days. So, at most 5 days out from launching this to friends and family.
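For a sense of what the Ludic version looks like, here is a minimal endpoint sketch in the style of Ludic's quickstart; the route and component are invented, and the exact API may have drifted since this commit:

```python
from ludic.html import p
from ludic.web import LudicApp

app = LudicApp()


@app.get("/")
async def index() -> p:
    # Endpoints return Ludic components directly instead of rendering
    # Flask templates; htmx requests hit the same RESTful routes.
    return p("Storybook home")
```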
2024-12-21  Implement storybook prototype  (Ben Sima)
This partially used gptme to create a storybook generator. The problem I ran into is that gptme doesn't do any architecting or make any considerations for maintainable, or even readable, code, so it just wrote one long script. I couldn't test it. Also, it didn't actually generate a 10-page story; it generated 10 separate stories. So I ended up writing it myself and using gptme to fix up TODOs that I wrote along the way.