Michael Ruminer
2 min read · Jul 29, 2024
AI generated crew image. Note that some folks are backwards. :-)

This weekend’s AI adventure was into agents, crewAI specifically. I learned a lot, and it made me even more eager to dig deeper.

One of the YouTube videos I watched was a line-by-line Python walkthrough of a crewAI example. I think the example was taken from the official docs/examples, but the line-by-line presentation was invaluable. It was surprisingly easy to set up a three-task, three-agent process along with some tools for it to run with. I was impressed. I plan to play around with the task and agent configurations to see how the output changes. The one downside is that using OpenAI GPT-4 may become prohibitively expensive when making lots of experimental calls, plus the cost of crewAI itself.
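To give a sense of what that setup looks like, here is a minimal sketch of a three-agent, three-task sequential crew. The roles, goals, and task descriptions are placeholders of my own rather than the ones from the video, and I haven't wired in any tools here.

```python
from crewai import Agent, Task, Crew, Process

# Three agents with distinct roles. Role/goal/backstory text is placeholder.
researcher = Agent(
    role="Researcher",
    goal="Gather background information on the topic",
    backstory="An analyst who digs up relevant facts.",
)
writer = Agent(
    role="Writer",
    goal="Turn the research notes into a readable draft",
    backstory="A clear, concise technical writer.",
)
editor = Agent(
    role="Editor",
    goal="Polish the draft for accuracy and tone",
    backstory="A meticulous copy editor.",
)

# Three tasks, one per agent.
research = Task(
    description="Research the topic and list the key points.",
    expected_output="A bullet list of key points.",
    agent=researcher,
)
write = Task(
    description="Write a short article from the research notes.",
    expected_output="A draft article of a few hundred words.",
    agent=writer,
)
edit = Task(
    description="Edit the draft for clarity and correctness.",
    expected_output="A final, polished article.",
    agent=editor,
)

# Run the tasks sequentially; each task sees the previous task's output.
crew = Crew(
    agents=[researcher, writer, editor],
    tasks=[research, write, edit],
    process=Process.sequential,
)

print(crew.kickoff())
```

By default a crew like this calls OpenAI using the OPENAI_API_KEY environment variable, which is exactly where the cost concern comes in.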

I don’t yet have a feel for the cost of that kind of usage, and each run may vary not only with the configuration but also because the agents might run more iterations one time than another; it depends on when they decide they are done. I have seen that you can run Llama locally, and there are even videos on how to connect crewAI to local instances of Llama, but I am not sure that will meet my experimentation needs. I need to play around with Llama first to find out.
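From what the videos show, the connection looks to be little more than pointing the agents at a local endpoint. A rough sketch, assuming a recent crewAI version that ships an LLM wrapper (older versions accept a LangChain model object instead) and an Ollama server running on its default port; the model name and URL are illustrative, not values I have tested:

```python
from crewai import Agent, LLM

# Assumes Ollama is serving locally on its default port and already has a
# Llama model pulled. Model name and URL are example values.
local_llm = LLM(
    model="ollama/llama3",
    base_url="http://localhost:11434",
)

researcher = Agent(
    role="Researcher",
    goal="Gather background information on the topic",
    backstory="An analyst who digs up relevant facts.",
    llm=local_llm,  # point this agent at the local model instead of OpenAI
)
```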

Another consideration is whether to do this on my PC, with its 16 GB of RAM and a measly Nvidia GTX 1060 card, or on my MacBook with 32 GB of RAM, though I am not sure what its GPU specs are. Some more research is in order, and perhaps side-by-side comparisons.

I’ll report back on my general findings.

Michael Ruminer

My most recent posts are on AI, especially from the perspective of a (currently) non-AI tech worker. did:web:manicprogrammer.github.io