|
1 | | -# LLM Command Snippet |
| 1 | +# AI Command Snippets |
2 | 2 |
|
3 | | -This snippet adds an `llm` command to mongosh that provides helpful suggestions or recommendations for MongoDB-related tasks. The query results are generated using Groq API by default, with an option to use other models via Ollama. |
| 3 | +> [!CAUTION] |
| 4 | +> This is an experimental, early-stage snippet that is not meant for production use. |
4 | 5 |
|
5 | | -## Prerequisites |
| 6 | +This snippet adds a suite of commands accessible through the `ai` command, including:
6 | 7 |
|
7 | | -``` |
8 | | -export GROQ_API_KEY=gsk_XXXXXXX |
9 | | -mongosh |
10 | | -``` |
11 | | - |
12 | | -## Usage |
13 | | - |
14 | | -After installing the snippet, you can use the `llm` command in your MongoDB shell like this: |
15 | | - |
16 | | -```javascript |
17 | | -llm("very briefly, just the command, do not use markdown: in mongosh how to get the collections names of current db?"); |
18 | | -``` |
19 | | - |
20 | | -This will output a possible solution to your query, such as `db.getCollectionNames()`. |
21 | | - |
22 | | -```javascript |
23 | | -llm("very briefly, just the command, do not use markdown: in mongosh replace all documents of a collection with property {'set':'llm102'} with the new value {'set':'llm101'} in current db?") |
24 | | -``` |
25 | | - |
26 | | -This will output a possible solution to your query, such as `db.collection.updateMany({ set: 'llm102' }, { $set: { set: 'llm101' } })`. |
27 | | - |
28 | | -You can also specify a different model to use with an optional parameter: |
29 | | -```javascript |
30 | | -llm("Your query here", { model: "phi3.5" }); |
31 | | -``` |
32 | | -This will use the specified model (in this case, 'phi3.5') via Ollama instead of the default Groq API. |
33 | | - |
34 | | -## Models |
35 | | - |
36 | | -By default, the `llm` command uses the Groq API with the 'llama-3.1-70b-versatile' model. You can use other models by specifying them in the optional parameter: |
37 | | - |
38 | | -- Groq API (default): No need to specify, just use `llm("Your query")`. |
39 | | -- Ollama models: Specify the model name, e.g., `llm("Your query", { model: "gemini2" })` or `llm("Your query", { model: "phi3.5" })`. |
40 | | - |
41 | | -Note: When using Ollama models, make sure you have Ollama running locally on the default port (11434). |
| 8 | +| Command | Description | Example |
| 9 | +|---------|-------------|---------| |
| 10 | +| `ai.ask` | Ask questions about MongoDB | `ai.ask how do I run queries in mongosh?` | |
| 11 | +| `ai.data` | Generate data-related mongosh commands | `ai.data insert some sample user info` | |
| 12 | +| `ai.query` | Generate a MongoDB query | `ai.query find documents where name = "Ada"` | |
| 13 | +| `ai.aggregate` | Generate a MongoDB aggregation | `ai.aggregate find documents where name = "Ada"` | |
| 14 | +| `ai.collection` | Set the active collection | `ai.collection("users")` | |
| 15 | +| `ai.shell` | Generate general mongosh commands | `ai.shell get sharding info` | |
| 16 | +| `ai.general` | Ask general questions to your model | `ai.general what is the meaning of life?` |
| 17 | +| `ai.config` | Configure the AI commands | `ai.config.set("provider", "ollama")` | |
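Putting a few of the commands from the table together, a session might look like the following sketch. The invocations are the ones shown in the table above; the suggestions in the comments are illustrative only, not guaranteed output.

```javascript
// Point the AI commands at a collection, then ask for suggestions.
ai.collection("users")
ai.query find documents where name = "Ada"
// might suggest something like: db.users.find({ name: "Ada" })
ai.shell get sharding info
// might suggest a command such as sh.status()
```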
42 | 18 |
|
| 19 | +The snippet currently supports five AI providers, `docs`, `openai`, `mistral`, `atlas`, and `ollama`, along with any model they support. For cloud providers, you can specify the API key via the `MONGOSH_AI_API_KEY` environment variable.
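As a minimal sketch (assuming provider names match the list above and that the key is read from the environment before mongosh starts), switching providers might look like this:

```javascript
// Before starting mongosh, export the key for a cloud provider, e.g.:
//   export MONGOSH_AI_API_KEY=XXXXXXXX
// Then, inside mongosh, choose the provider the ai commands should use:
ai.config.set("provider", "openai")
// or point at a local Ollama instance instead:
ai.config.set("provider", "ollama")
```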
43 | 20 |
|
44 | 21 | ## Installation |
45 | 22 |
|
46 | 23 | You can install this snippet using the `snippet` command in mongosh: |
47 | 24 |
|
48 | 25 | ```javascript |
49 | | -snippet install llm-command |
| 26 | +config.set('snippetIndexSourceURLs', |
| 27 | +  'https://github.com/gagik/mongosh-snippets/raw/refs/heads/ai/index.bson.br;' +
| 28 | +  config.get('snippetIndexSourceURLs'))
| 29 | +snippet install ai |
50 | 30 | ``` |
51 | 31 |
|
52 | 32 | ## Troubleshooting |
|