Why CodeSumma? #3
ryanmac announced in Announcements
I build little ChatGPT-assisted POCs every week or two.
The most tedious part is summarizing code for ChatGPT: giving it the right context while staying under the token limit. I built CodeSumma to make that easier.
It recursively summarizes files into something similar to a Smart Contract ABI. It can also summarize them using the OpenAI API's cost-effective-but-sufficient `text-davinci-002` model. Then it condenses everything to fit under a token limit for easy pasting into ChatGPT (a rough sketch of this idea is included after the examples below).

Here's why it's cool:
- `--traceback` to paste in the error, and it will make sure to include it in the output.
- `--max-tokens-out` to limit the number of tokens to fit ChatGPT's limits (or smaller, if you'd like to leave space to write a new feature request).
- `--copy` to fill your clipboard so you can paste it into ChatGPT as quick context for your prompt.
- `--print-full` to print the full code for one file, but summarize the rest.
- `--manual` to prompt through all of the above.

Examples:
- `cs -cp` to summarize the cwd.
- `cs -i tests readme -o 4096` to summarize the cwd, ignoring the tests and readme files, and keeping the output under 4096 tokens.
- `cs https://github.com/ryanmac/CodeSumma -o 1000` to get a quick code summary of a remote repo under 1000 tokens.
- `cs -m` when I need to paste in a traceback.

Ultimately, I'm looking to use this to build an AI agent for Auto-GPT or a ChatGPT Plugin to make it easier to summarize code (under a token limit) for better context.
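For anyone curious about the general shape of the summarize-then-trim idea, here's a rough sketch. This is not the actual CodeSumma code: the file filter, prompt wording, and `p50k_base` tokenizer choice are my assumptions, and it uses the pre-1.0 `openai` package plus `tiktoken`.

```python
# Hypothetical sketch of "summarize each file, then trim to a token budget".
# Not the CodeSumma implementation; assumes pre-1.0 `openai` and `tiktoken`.
import os
import openai
import tiktoken

ENC = tiktoken.get_encoding("p50k_base")  # tokenizer family used by text-davinci-00x


def summarize_file(path: str, max_tokens: int = 256) -> str:
    """Ask the model for an ABI-like outline: signatures and one-liners, no bodies."""
    with open(path, "r", errors="ignore") as f:
        code = f.read()
    resp = openai.Completion.create(
        model="text-davinci-002",
        prompt=(
            "Summarize this file as a list of function/class signatures "
            f"with one-line descriptions:\n\n{code}\n\nSummary:"
        ),
        max_tokens=max_tokens,
        temperature=0,
    )
    return resp["choices"][0]["text"].strip()


def summarize_repo(root: str, max_tokens_out: int = 4096) -> str:
    """Walk the tree, summarize each source file, and trim the result to the budget."""
    parts = []
    for dirpath, _, filenames in os.walk(root):
        for name in sorted(filenames):
            if name.endswith((".py", ".js", ".ts", ".sol")):  # assumed filter
                path = os.path.join(dirpath, name)
                parts.append(f"# {path}\n{summarize_file(path)}")
    summary = "\n\n".join(parts)
    # Naive trim: cut trailing tokens until the whole thing fits the budget,
    # roughly what an --max-tokens-out / -o style option would enforce.
    tokens = ENC.encode(summary)
    if len(tokens) > max_tokens_out:
        summary = ENC.decode(tokens[:max_tokens_out])
    return summary


if __name__ == "__main__":
    # Needs OPENAI_API_KEY in the environment.
    print(summarize_repo(".", max_tokens_out=4096))
```

A real tool would summarize directories bottom-up (summaries of summaries) rather than naively truncating, but the sketch shows the two moving parts: per-file summarization and a hard token budget on the final output.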
Contributions welcome.