author     icebaker <icebaker@proton.me>  2024-01-06 22:09:23 -0300
committer  icebaker <icebaker@proton.me>  2024-01-06 22:09:23 -0300
commit     c4807b26f0d530ef99ff87b6c5c45a4953ba958a (patch)
tree       8d2709d35089ec8afb60bd20c6855de4623e3d1b
parent     bfe0e76e3683a71bb8ce5bfdaae99b0252e7be05 (diff)
adding new providers
-rw-r--r--  Gemfile.lock                                        22
-rw-r--r--  README.md                                          917
-rw-r--r--  components/provider.rb                              14
-rw-r--r--  components/providers/cohere.rb                       2
-rw-r--r--  components/providers/maritaca.rb                   113
-rw-r--r--  components/providers/ollama.rb                     132
-rw-r--r--  logic/cartridge/streaming.rb                         4
-rw-r--r--  logic/providers/maritaca/tokens.rb                  14
-rw-r--r--  logic/providers/ollama/tokens.rb                    14
-rw-r--r--  nano-bots.gemspec                                   11
-rw-r--r--  spec/data/cartridges/models/maritaca/maritalk.yml   12
-rw-r--r--  spec/data/cartridges/models/ollama/llama2.yml       10
-rw-r--r--  spec/tasks/run-model.rb                             39
-rw-r--r--  static/gem.rb                                        8
14 files changed, 962 insertions, 350 deletions
diff --git a/Gemfile.lock b/Gemfile.lock
index be5f77f..2884e99 100644
--- a/Gemfile.lock
+++ b/Gemfile.lock
@@ -1,13 +1,15 @@
PATH
remote: .
specs:
- nano-bots (2.4.1)
+ nano-bots (2.5.0)
babosa (~> 2.0)
cohere-ai (~> 1.0, >= 1.0.1)
concurrent-ruby (~> 1.2, >= 1.2.2)
dotenv (~> 2.8, >= 2.8.1)
- gemini-ai (~> 3.1)
+ gemini-ai (~> 3.1, >= 3.1.2)
+ maritaca-ai (~> 1.0)
mistral-ai (~> 1.1)
+ ollama-ai (~> 1.0)
pry (~> 0.14.2)
rainbow (~> 3.1, >= 3.1.1)
rbnacl (~> 7.1, >= 7.1.1)
@@ -38,10 +40,10 @@ GEM
multipart-post (~> 2)
faraday-net_http (3.0.2)
ffi (1.16.3)
- gemini-ai (3.1.0)
+ gemini-ai (3.1.2)
event_stream_parser (~> 1.0)
faraday (~> 2.8, >= 2.8.1)
- googleauth (~> 1.9, >= 1.9.1)
+ googleauth (~> 1.8)
google-cloud-env (2.1.0)
faraday (>= 1.0, < 3.a)
googleauth (1.9.1)
@@ -54,15 +56,19 @@ GEM
json (2.7.1)
jwt (2.7.1)
language_server-protocol (3.17.0.3)
+ maritaca-ai (1.0.0)
+ faraday (~> 2.8, >= 2.8.1)
method_source (1.0.0)
mistral-ai (1.1.0)
event_stream_parser (~> 1.0)
faraday (~> 2.8, >= 2.8.1)
multi_json (1.15.0)
multipart-post (2.3.0)
+ ollama-ai (1.0.0)
+ faraday (~> 2.8)
os (1.1.4)
parallel (1.24.0)
- parser (3.2.2.4)
+ parser (3.3.0.0)
ast (~> 2.4.1)
racc
pry (0.14.2)
@@ -104,11 +110,11 @@ GEM
unicode-display_width (>= 2.4.0, < 3.0)
rubocop-ast (1.30.0)
parser (>= 3.2.1.0)
- rubocop-capybara (2.19.0)
+ rubocop-capybara (2.20.0)
rubocop (~> 1.41)
- rubocop-factory_bot (2.24.0)
+ rubocop-factory_bot (2.25.0)
rubocop (~> 1.33)
- rubocop-rspec (2.25.0)
+ rubocop-rspec (2.26.1)
rubocop (~> 1.40)
rubocop-capybara (~> 2.17)
rubocop-factory_bot (~> 2.22)
diff --git a/README.md b/README.md
index 80e10e3..5490494 100644
--- a/README.md
+++ b/README.md
@@ -1,218 +1,109 @@
# Nano Bots 💎 🤖
-A Ruby implementation of the [Nano Bots](https://github.com/icebaker/nano-bots) specification with support for [OpenAI ChatGPT](https://openai.com/chatgpt), [Mistral AI](https://mistral.ai), [Cohere Command](https://cohere.com), and [Google Gemini](https://deepmind.google/technologies/gemini).
+An implementation of the [Nano Bots](https://spec.nbots.io) specification with support for [Cohere Command](https://cohere.com), [Google Gemini](https://deepmind.google/technologies/gemini), [Maritaca AI MariTalk](https://www.maritaca.ai), [Mistral AI](https://mistral.ai), [Ollama](https://ollama.ai), [OpenAI ChatGPT](https://openai.com/chatgpt), and others.
![Ruby Nano Bots](https://raw.githubusercontent.com/icebaker/assets/main/nano-bots/ruby-nano-bots-canvas.png)
https://user-images.githubusercontent.com/113217272/238141567-c58a240c-7b67-4b3b-864a-0f49bbf6e22f.mp4
-- [Setup](#setup)
- - [Cohere Command](#cohere-command)
- - [Google Gemini](#google-gemini)
- - [Mistral AI](#mistral-ai)
- - [OpenAI ChatGPT](#openai-chatgpt)
-- [Usage](#usage)
- - [Command Line](#command-line)
- - [Library](#library)
-- [Cartridges](#cartridges)
- - [Cohere Command](#cohere-command-1)
- - [Google Gemini](#google-gemini-1)
- - [Mistral AI](#mistral-ai-1)
- - [OpenAI ChatGPT](#openai-chatgpt-1)
- - [Tools (Functions)](#tools-functions)
- - [Experimental Clojure Support](#experimental-clojure-support)
- - [Marketplace](#marketplace)
-- [Docker](#docker)
- - [Cohere Command](#cohere-command-2)
- - [Google Gemini](#google-gemini-2)
- - [Mistral AI](#mistral-ai-2)
- - [OpenAI ChatGPT](#openai-chatgpt-2)
-- [Security and Privacy](#security-and-privacy)
- - [Cryptography](#cryptography)
- - [End-user IDs](#end-user-ids)
- - [Decrypting](#decrypting)
-- [Providers](#providers)
-- [Debugging](#debugging)
-- [Development](#development)
- - [Publish to RubyGems](#publish-to-rubygems)
-
-## Setup
-
-For a system usage:
+## TL;DR and Quick Start
```sh
gem install nano-bots -v 2.4.1
```
-To use it in a project, add it to your `Gemfile`:
-
-```ruby
-gem 'nano-bots', '~> 2.4.1'
-```
-
-```sh
-bundle install
-```
-
-For credentials and configurations, relevant environment variables can be set in your `.bashrc`, `.zshrc`, or equivalent files, as well as in your Docker Container or System Environment. Example:
-
-```sh
-export NANO_BOTS_ENCRYPTION_PASSWORD=UNSAFE
-export NANO_BOTS_END_USER=your-user
-
-# export NANO_BOTS_STATE_DIRECTORY=/home/user/.local/state/nano-bots
-# export NANO_BOTS_CARTRIDGES_DIRECTORY=/home/user/.local/share/nano-bots/cartridges
-```
-
-Alternatively, if your current directory has a `.env` file with the environment variables, they will be automatically loaded:
-
-```sh
-NANO_BOTS_ENCRYPTION_PASSWORD=UNSAFE
-NANO_BOTS_END_USER=your-user
-
-# NANO_BOTS_STATE_DIRECTORY=/home/user/.local/state/nano-bots
-# NANO_BOTS_CARTRIDGES_DIRECTORY=/home/user/.local/share/nano-bots/cartridges
-```
-
-### Cohere Command
-
-You can obtain your credentials on the [Cohere Platform](https://dashboard.cohere.com).
-
-```sh
-export COHERE_API_ADDRESS=https://api.cohere.ai
-export COHERE_API_KEY=your-api-key
-```
-
-Alternatively, if your current directory has a `.env` file with the environment variables, they will be automatically loaded:
-
-```sh
-COHERE_API_ADDRESS=https://api.cohere.ai
-COHERE_API_KEY=your-api-key
-```
-
-### Mistral AI
-
-You can obtain your credentials on the [Mistral Platform](https://console.mistral.ai).
-
-```sh
-export MISTRAL_API_ADDRESS=https://api.mistral.ai
-export MISTRAL_API_KEY=your-api-key
-```
-
-Alternatively, if your current directory has a `.env` file with the environment variables, they will be automatically loaded:
-
-```sh
-MISTRAL_API_ADDRESS=https://api.mistral.ai
-MISTRAL_API_KEY=your-api-key
-```
-
-### OpenAI ChatGPT
-
-You can obtain your credentials on the [OpenAI Platform](https://platform.openai.com).
-
-```sh
-export OPENAI_API_ADDRESS=https://api.openai.com
-export OPENAI_API_KEY=your-access-token
-```
-
-Alternatively, if your current directory has a `.env` file with the environment variables, they will be automatically loaded:
-
-```sh
-OPENAI_API_ADDRESS=https://api.openai.com
-OPENAI_API_KEY=your-access-token
+```bash
+nb - - eval "hello"
+# => Hello! How may I assist you today?
```
-### Google Gemini
-
-Click [here](https://github.com/gbaptista/gemini-ai#credentials) to learn how to obtain your credentials.
-
-#### Option 1: API Key (Generative Language API)
-
-```sh
-export GOOGLE_API_KEY=your-api-key
-
-export NANO_BOTS_ENCRYPTION_PASSWORD=UNSAFE
-export NANO_BOTS_END_USER=your-user
-
-# export NANO_BOTS_STATE_DIRECTORY=/home/user/.local/state/nano-bots
-# export NANO_BOTS_CARTRIDGES_DIRECTORY=/home/user/.local/share/nano-bots/cartridges
+```bash
+nb - - repl
```
-Alternatively, if your current directory has a `.env` file with the environment variables, they will be automatically loaded:
-
-```sh
-GOOGLE_API_KEY=your-api-key
+```text
+🤖> Hi, how are you doing?
-NANO_BOTS_ENCRYPTION_PASSWORD=UNSAFE
-NANO_BOTS_END_USER=your-user
+As an AI language model, I do not experience emotions but I am functioning
+well. How can I assist you?
-# NANO_BOTS_STATE_DIRECTORY=/home/user/.local/state/nano-bots
-# NANO_BOTS_CARTRIDGES_DIRECTORY=/home/user/.local/share/nano-bots/cartridges
+🤖> |
```
-#### Option 2: Service Account Credentials File (Vertex AI API)
-
-```sh
-export GOOGLE_CREDENTIALS_FILE_PATH=google-credentials.json
-export GOOGLE_REGION=us-east4
-
-export NANO_BOTS_ENCRYPTION_PASSWORD=UNSAFE
-export NANO_BOTS_END_USER=your-user
+```yaml
+---
+meta:
+ symbol: 🤖
+ name: ChatGPT
-# export NANO_BOTS_STATE_DIRECTORY=/home/user/.local/state/nano-bots
-# export NANO_BOTS_CARTRIDGES_DIRECTORY=/home/user/.local/share/nano-bots/cartridges
+provider:
+ id: openai
+ credentials:
+ access-token: ENV/OPENAI_API_KEY
+ settings:
+ model: gpt-4-1106-preview
```
-Alternatively, if your current directory has a `.env` file with the environment variables, they will be automatically loaded:
-
-```sh
-GOOGLE_CREDENTIALS_FILE_PATH=google-credentials.json
-GOOGLE_REGION=us-east4
-
-NANO_BOTS_ENCRYPTION_PASSWORD=UNSAFE
-NANO_BOTS_END_USER=your-user
-
-# NANO_BOTS_STATE_DIRECTORY=/home/user/.local/state/nano-bots
-# NANO_BOTS_CARTRIDGES_DIRECTORY=/home/user/.local/share/nano-bots/cartridges
+```bash
+nb gpt.yml - eval "hi"
+# => Hello! How can I assist you today?
```
-#### Option 3: Application Default Credentials (Vertex AI API)
-
-```sh
-export GOOGLE_REGION=us-east4
-
-export NANO_BOTS_ENCRYPTION_PASSWORD=UNSAFE
-export NANO_BOTS_END_USER=your-user
-
-# export NANO_BOTS_STATE_DIRECTORY=/home/user/.local/state/nano-bots
-# export NANO_BOTS_CARTRIDGES_DIRECTORY=/home/user/.local/share/nano-bots/cartridges
+```ruby
+gem 'nano-bots', '~> 2.4.1'
```
-Alternatively, if your current directory has a `.env` file with the environment variables, they will be automatically loaded:
-
-```sh
-GOOGLE_REGION=us-east4
-
-NANO_BOTS_ENCRYPTION_PASSWORD=UNSAFE
-NANO_BOTS_END_USER=your-user
-
-# NANO_BOTS_STATE_DIRECTORY=/home/user/.local/state/nano-bots
-# NANO_BOTS_CARTRIDGES_DIRECTORY=/home/user/.local/share/nano-bots/cartridges
-```
+```ruby
+require 'nano-bots'
-#### Custom Project ID
+bot = NanoBot.new(cartridge: 'gpt.yml')
-If you need to manually set a Google Project ID:
+bot.eval('Hi!') do |content, fragment, finished, meta|
+ print fragment unless fragment.nil?
+end
-```sh
-export GOOGLE_PROJECT_ID=your-project-id
+# => Hello! How can I assist you today?
```
-Alternatively, if your current directory has a `.env` file with the environment variables, they will be automatically loaded:
-
-```sh
-GOOGLE_PROJECT_ID=your-project-id
-```
+- [TL;DR and Quick Start](#tldr-and-quick-start)
+- [Usage](#usage)
+ - [Command Line](#command-line)
+ - [Debugging](#debugging)
+ - [Library](#library)
+- [Setup](#setup)
+ - [Cohere Command](#cohere-command)
+ - [Maritaca AI MariTalk](#maritaca-ai-maritalk)
+ - [Mistral AI](#mistral-ai)
+ - [Ollama](#ollama)
+ - [OpenAI ChatGPT](#openai-chatgpt)
+ - [Google Gemini](#google-gemini)
+ - [Option 1: API Key (Generative Language API)](#option-1-api-key-generative-language-api)
+ - [Option 2: Service Account Credentials File (Vertex AI API)](#option-2-service-account-credentials-file-vertex-ai-api)
+ - [Option 3: Application Default Credentials (Vertex AI API)](#option-3-application-default-credentials-vertex-ai-api)
+ - [Custom Project ID](#custom-project-id)
+- [Cartridges](#cartridges)
+ - [Tools (Functions)](#tools-functions)
+ - [Experimental Clojure Support](#experimental-clojure-support)
+ - [Marketplace](#marketplace)
+- [Security and Privacy](#security-and-privacy)
+ - [Cryptography](#cryptography)
+ - [End-user IDs](#end-user-ids)
+ - [Decrypting](#decrypting)
+- [Supported Providers](#supported-providers)
+- [Docker](#docker)
+ - [Cohere Command Container](#cohere-command-container)
+ - [Maritaca AI MariTalk Container](#maritaca-ai-maritalk-container)
+ - [Mistral AI Container](#mistral-ai-container)
+ - [Ollama Container](#ollama-container)
+ - [OpenAI ChatGPT Container](#openai-chatgpt-container)
+ - [Google Gemini Container](#google-gemini-container)
+ - [Option 1: API Key (Generative Language API) Config](#option-1-api-key-generative-language-api-config)
+ - [Option 2: Service Account Credentials File (Vertex AI API) Config](#option-2-service-account-credentials-file-vertex-ai-api-config)
+ - [Option 3: Application Default Credentials (Vertex AI API) Config](#option-3-application-default-credentials-vertex-ai-api-config)
+ - [Custom Project ID Config](#custom-project-id-config)
+ - [Running the Container](#running-the-container)
+- [Development](#development)
+ - [Publish to RubyGems](#publish-to-rubygems)
## Usage
@@ -345,17 +236,59 @@ bot.boot do |content, fragment, finished, meta|
end
```
-## Cartridges
+## Setup
-Check the Nano Bots specification to learn more about [how to build cartridges](https://spec.nbots.io/#/README?id=cartridges).
+To install the CLI on your system:
-Try the [Nano Bots Clinic (Live Editor)](https://clinic.nbots.io) to learn about creating Cartridges.
+```sh
+gem install nano-bots -v 2.4.1
+```
-Here's what a Nano Bot Cartridge looks like:
+To use it in a Ruby project as a library, add to your `Gemfile`:
+
+```ruby
+gem 'nano-bots', '~> 2.4.1'
+```
+
+```sh
+bundle install
+```
+
+For credentials and configurations, relevant environment variables can be set in your `.bashrc`, `.zshrc`, or equivalent files, as well as in your Docker Container or System Environment. Example:
+
+```sh
+export NANO_BOTS_ENCRYPTION_PASSWORD=UNSAFE
+export NANO_BOTS_END_USER=your-user
+
+# export NANO_BOTS_STATE_DIRECTORY=/home/user/.local/state/nano-bots
+# export NANO_BOTS_CARTRIDGES_DIRECTORY=/home/user/.local/share/nano-bots/cartridges
+```
+
+Alternatively, if your current directory has a `.env` file with the environment variables, they will be automatically loaded:
+
+```sh
+NANO_BOTS_ENCRYPTION_PASSWORD=UNSAFE
+NANO_BOTS_END_USER=your-user
+
+# NANO_BOTS_STATE_DIRECTORY=/home/user/.local/state/nano-bots
+# NANO_BOTS_CARTRIDGES_DIRECTORY=/home/user/.local/share/nano-bots/cartridges
+```
### Cohere Command
-Read the [full specification](https://spec.nbots.io/#/README?id=cohere-command) for Cohere Command.
+You can obtain your credentials on the [Cohere Platform](https://dashboard.cohere.com).
+
+```sh
+export COHERE_API_KEY=your-api-key
+```
+
+Alternatively, if your current directory has a `.env` file with the environment variables, they will be automatically loaded:
+
+```sh
+COHERE_API_KEY=your-api-key
+```
+
+Create a `cartridge.yml` file:
```yaml
---
@@ -379,10 +312,90 @@ provider:
model: command
```
-### Mistral AI
+Read the [full specification](https://spec.nbots.io/#/README?id=cohere-command) for Cohere Command.
+
+```bash
+nb cartridge.yml - eval "Hello"
+
+nb cartridge.yml - repl
+```
+
+```ruby
+bot = NanoBot.new(cartridge: 'cartridge.yml')
+
+puts bot.eval('Hello')
+```
+
+### Maritaca AI MariTalk
+
+You can obtain your API key at [MariTalk](https://chat.maritaca.ai).
+
+Enclose the credentials in single quotes when setting the environment variable to prevent the shell from expanding the `$` character in the API key:
+
+```sh
+export MARITACA_API_KEY='123...$a12...'
+```
+
+Alternatively, if your current directory has a `.env` file with the environment variables, they will be automatically loaded:
+
+```sh
+MARITACA_API_KEY='123...$a12...'
+```
+
+Create a `cartridge.yml` file:
+
+```yaml
+---
+meta:
+ symbol: 🤖
+ name: Nano Bot Name
+ author: Your Name
+ version: 1.0.0
+ license: CC0-1.0
+ description: A helpful assistant.
+
+behaviors:
+ interaction:
+ directive: You are a helpful assistant.
+
+provider:
+ id: maritaca
+ credentials:
+ api-key: ENV/MARITACA_API_KEY
+ settings:
+ model: maritalk
+```
-Read the [full specification](https://spec.nbots.io/#/README?id=mistral-ai) for Mistral AI.
+Read the [full specification](https://spec.nbots.io/#/README?id=maritaca-ai-maritalk) for Maritaca AI MariTalk.
+```bash
+nb cartridge.yml - eval "Hello"
+
+nb cartridge.yml - repl
+```
+
+```ruby
+bot = NanoBot.new(cartridge: 'cartridge.yml')
+
+puts bot.eval('Hello')
+```
+
+### Mistral AI
+
+You can obtain your credentials on the [Mistral Platform](https://console.mistral.ai).
+
+```sh
+export MISTRAL_API_KEY=your-api-key
+```
+
+Alternatively, if your current directory has a `.env` file with the environment variables, they will be automatically loaded:
+
+```sh
+MISTRAL_API_KEY=your-api-key
+```
+
+Create a `cartridge.yml` file:
+
```yaml
---
meta:
@@ -405,9 +418,87 @@ provider:
model: mistral-medium
```
+Read the [full specification](https://spec.nbots.io/#/README?id=mistral-ai) for Mistral AI.
+
+```bash
+nb cartridge.yml - eval "Hello"
+
+nb cartridge.yml - repl
+```
+
+```ruby
+bot = NanoBot.new(cartridge: 'cartridge.yml')
+
+puts bot.eval('Hello')
+```
+
+### Ollama
+
+To install and set up, follow the instructions on the [Ollama](https://ollama.ai) website.
+
+```sh
+export OLLAMA_API_ADDRESS=http://localhost:11434
+```
+
+Alternatively, if your current directory has a `.env` file with the environment variables, they will be automatically loaded:
+
+```sh
+OLLAMA_API_ADDRESS=http://localhost:11434
+```
+
+Create a `cartridge.yml` file:
+
+```yaml
+---
+meta:
+ symbol: 🤖
+ name: Nano Bot Name
+ author: Your Name
+ version: 1.0.0
+ license: CC0-1.0
+ description: A helpful assistant.
+
+behaviors:
+ interaction:
+ directive: You are a helpful assistant.
+
+provider:
+ id: ollama
+ credentials:
+ address: ENV/OLLAMA_API_ADDRESS
+ settings:
+ model: dolphin-phi
+```
+
+Read the [full specification](https://spec.nbots.io/#/README?id=ollama) for Ollama.
+
+```bash
+nb cartridge.yml - eval "Hello"
+
+nb cartridge.yml - repl
+```
+
+```ruby
+bot = NanoBot.new(cartridge: 'cartridge.yml')
+
+puts bot.eval('Hello')
+```
+
### OpenAI ChatGPT
-Read the [full specification](https://spec.nbots.io/#/README?id=openai-chatgpt) for OpenAI ChatGPT.
+You can obtain your credentials on the [OpenAI Platform](https://platform.openai.com).
+
+```sh
+export OPENAI_API_KEY=your-access-token
+```
+
+Alternatively, if your current directory has a `.env` file with the environment variables, they will be automatically loaded:
+
+```sh
+OPENAI_API_KEY=your-access-token
+```
+
+Create a `cartridge.yml` file:
```yaml
---
@@ -426,19 +517,44 @@ behaviors:
provider:
id: openai
credentials:
- address: ENV/OPENAI_API_ADDRESS
access-token: ENV/OPENAI_API_KEY
settings:
user: ENV/NANO_BOTS_END_USER
model: gpt-4-1106-preview
```
+Read the [full specification](https://spec.nbots.io/#/README?id=openai-chatgpt) for OpenAI ChatGPT.
+
+```bash
+nb cartridge.yml - eval "Hello"
+
+nb cartridge.yml - repl
+```
+
+```ruby
+bot = NanoBot.new(cartridge: 'cartridge.yml')
+
+puts bot.eval('Hello')
+```
+
### Google Gemini
-Read the [full specification](https://spec.nbots.io/#/README?id=google-gemini) for Google Gemini.
+Click [here](https://github.com/gbaptista/gemini-ai#credentials) to learn how to obtain your credentials.
#### Option 1: API Key (Generative Language API)
+```sh
+export GOOGLE_API_KEY=your-api-key
+```
+
+Alternatively, if your current directory has a `.env` file with the environment variables, they will be automatically loaded:
+
+```sh
+GOOGLE_API_KEY=your-api-key
+```
+
+Create a `cartridge.yml` file:
+
```yaml
---
meta:
@@ -462,8 +578,36 @@ provider:
model: gemini-pro
```
+Read the [full specification](https://spec.nbots.io/#/README?id=google-gemini) for Google Gemini.
+
+```bash
+nb cartridge.yml - eval "Hello"
+
+nb cartridge.yml - repl
+```
+
+```ruby
+bot = NanoBot.new(cartridge: 'cartridge.yml')
+
+puts bot.eval('Hello')
+```
+
#### Option 2: Service Account Credentials File (Vertex AI API)
+```sh
+export GOOGLE_CREDENTIALS_FILE_PATH=google-credentials.json
+export GOOGLE_REGION=us-east4
+```
+
+Alternatively, if your current directory has a `.env` file with the environment variables, they will be automatically loaded:
+
+```sh
+GOOGLE_CREDENTIALS_FILE_PATH=google-credentials.json
+GOOGLE_REGION=us-east4
+```
+
+Create a `cartridge.yml` file:
+
```yaml
---
meta:
@@ -488,8 +632,34 @@ provider:
model: gemini-pro
```
+Read the [full specification](https://spec.nbots.io/#/README?id=google-gemini) for Google Gemini.
+
+```bash
+nb cartridge.yml - eval "Hello"
+
+nb cartridge.yml - repl
+```
+
+```ruby
+bot = NanoBot.new(cartridge: 'cartridge.yml')
+
+puts bot.eval('Hello')
+```
+
#### Option 3: Application Default Credentials (Vertex AI API)
+```sh
+export GOOGLE_REGION=us-east4
+```
+
+Alternatively, if your current directory has a `.env` file with the environment variables, they will be automatically loaded:
+
+```sh
+GOOGLE_REGION=us-east4
+```
+
+Create a `cartridge.yml` file:
+
```yaml
---
meta:
@@ -513,10 +683,36 @@ provider:
model: gemini-pro
```
+Read the [full specification](https://spec.nbots.io/#/README?id=google-gemini) for Google Gemini.
+
+```bash
+nb cartridge.yml - eval "Hello"
+
+nb cartridge.yml - repl
+```
+
+```ruby
+bot = NanoBot.new(cartridge: 'cartridge.yml')
+
+puts bot.eval('Hello')
+```
+
#### Custom Project ID
If you need to manually set a Google Project ID:
+```sh
+export GOOGLE_PROJECT_ID=your-project-id
+```
+
+Alternatively, if your current directory has a `.env` file with the environment variables, they will be automatically loaded:
+
+```sh
+GOOGLE_PROJECT_ID=your-project-id
+```
+
+Add to your `cartridge.yml` file:
+
```yaml
---
provider:
@@ -525,6 +721,37 @@ provider:
project-id: ENV/GOOGLE_PROJECT_ID
```
+## Cartridges
+
+Check the Nano Bots specification to learn more about [how to build cartridges](https://spec.nbots.io/#/README?id=cartridges).
+
+Try the [Nano Bots Clinic (Live Editor)](https://clinic.nbots.io) to learn about creating Cartridges.
+
+Here's what a Nano Bot Cartridge looks like:
+
+```yaml
+---
+meta:
+ symbol: 🤖
+ name: Nano Bot Name
+ author: Your Name
+ version: 1.0.0
+ license: CC0-1.0
+ description: A helpful assistant.
+
+behaviors:
+ interaction:
+ directive: You are a helpful assistant.
+
+provider:
+ id: openai
+ credentials:
+ access-token: ENV/OPENAI_API_KEY
+ settings:
+ user: ENV/NANO_BOTS_END_USER
+ model: gpt-4-1106-preview
+```
+
### Tools (Functions)
Nano Bots can also be powered by _Tools_ (Functions):
@@ -550,7 +777,7 @@ The randomly generated number is 59.
🤖> |
```
-To successfully use Tools (Functions), you need to specify a provider and a model that supports them. As of the writing of this README, the provider that supports them is [OpenAI](https://platform.openai.com/docs/models), with models `gpt-3.5-turbo-1106` and `gpt-4-1106-preview`, and [Google](https://cloud.google.com/vertex-ai/docs/generative-ai/multimodal/function-calling#supported_models), with the `vertex-ai-api` service and the model `gemini-pro`. Mistral AI does not support tools.
+To successfully use Tools (Functions), you need to specify a provider and a model that support them. As of this writing, the supported providers are [OpenAI](https://platform.openai.com/docs/models), with the models `gpt-3.5-turbo-1106` and `gpt-4-1106-preview`, and [Google](https://cloud.google.com/vertex-ai/docs/generative-ai/multimodal/function-calling#supported_models), with the `vertex-ai-api` service and the `gemini-pro` model. Other providers do not yet have support.
Check the [Nano Bots specification](https://spec.nbots.io/#/README?id=tools-functions-2) to learn more about Tools (Functions).
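For reference, a provider block that meets the requirement above can reuse the OpenAI configuration shown earlier in this README (model names as of this writing):

```yaml
---
provider:
  id: openai
  credentials:
    access-token: ENV/OPENAI_API_KEY
  settings:
    model: gpt-4-1106-preview
```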
@@ -595,6 +822,128 @@ safety:
You can explore the Nano Bots [Marketplace](https://nbots.io) to discover new Cartridges that can help you.
+## Security and Privacy
+
+Each provider will have its own security and privacy policies (e.g. [OpenAI Policy](https://openai.com/policies/api-data-usage-policies)), so you must consult them to understand their implications.
+
+### Cryptography
+
+By default, all states stored in your local disk are encrypted.
+
+To ensure that the encryption is secure, you need to define a password through the `NANO_BOTS_ENCRYPTION_PASSWORD` environment variable. Otherwise, the content will still be encrypted, but anyone could decrypt it, since no password would be required.
+
+It's important to note that the content shared with providers, despite being transmitted over secure connections (e.g., [HTTPS](https://en.wikipedia.org/wiki/HTTPS)), will be readable by the provider. This is because providers need to operate on the data, which would not be possible if the content was encrypted beyond HTTPS. So, the data stored locally on your system is encrypted, which does not mean that what you share with providers will not be readable by them.
+
+To ensure that your encryption and password are configured properly, you can run the following command:
+```sh
+nb security
+```
+
+Which should return:
+```text
+✅ Encryption is enabled and properly working.
+ This means that your data is stored in an encrypted format on your disk.
+
+✅ A password is being used for the encrypted content.
+ This means that only those who possess the password can decrypt your data.
+```
+
+Alternatively, you can check it at runtime with:
+```ruby
+require 'nano-bots'
+
+NanoBot.security.check
+# => { encryption: true, password: true }
+```
+
+### End-user IDs
+
+A common strategy for deploying Nano Bots to multiple users through APIs or automations is to assign a unique [end-user ID](https://platform.openai.com/docs/guides/safety-best-practices/end-user-ids) to each user. This can be useful if any of your users violate the provider's policy through abusive behavior. By providing the end-user ID, you can demonstrate that even though the activity originated from your API key, the actions taken were not your own.
+
+You can define custom end-user identifiers in the following way:
+
+```ruby
+NanoBot.new(environment: { NANO_BOTS_END_USER: 'custom-user-a' })
+NanoBot.new(environment: { NANO_BOTS_END_USER: 'custom-user-b' })
+```
+
+Consider that you have the following end-user identifier in your environment:
+```sh
+NANO_BOTS_END_USER=your-name
+```
+
+Or a configuration in your Cartridge:
+```yml
+---
+provider:
+ id: openai
+ settings:
+ user: your-name
+```
+
+The requests will be performed as follows:
+
+```ruby
+NanoBot.new(cartridge: '-')
+# { user: 'your-name' }
+
+NanoBot.new(cartridge: '-', environment: { NANO_BOTS_END_USER: 'custom-user-a' })
+# { user: 'custom-user-a' }
+
+NanoBot.new(cartridge: '-', environment: { NANO_BOTS_END_USER: 'custom-user-b' })
+# { user: 'custom-user-b' }
+```
+
+In practice, to enhance privacy, neither your identifier nor your users' identifiers are shared in plain text. Instead, they are encrypted before being shared with the provider:
+
+```ruby
+'your-name'
+# _O7OjYUESagb46YSeUeSfSMzoO1Yg0BZqpsAkPg4j62SeNYlgwq3kn51Ob2wmIehoA==
+
+'custom-user-a'
+# _O7OjYUESagb46YSeUeSfSMzoO1Yg0BZJgIXHCBHyADW-rn4IQr-s2RvP7vym8u5tnzYMIs=
+
+'custom-user-b'
+# _O7OjYUESagb46YSeUeSfSMzoO1Yg0BZkjUwCcsh9sVppKvYMhd2qGRvP7vym8u5tnzYMIg=
+```
+
+In this manner, you possess identifiers if required, however, their actual content can only be decrypted by you via your secure password (`NANO_BOTS_ENCRYPTION_PASSWORD`).
+
+### Decrypting
+
+To decrypt your encrypted data, once you have properly configured your password, you can simply run:
+
+```ruby
+require 'nano-bots'
+
+NanoBot.security.decrypt('_O7OjYUESagb46YSeUeSfSMzoO1Yg0BZqpsAkPg4j62SeNYlgwq3kn51Ob2wmIehoA==')
+# your-name
+
+NanoBot.security.decrypt('_O7OjYUESagb46YSeUeSfSMzoO1Yg0BZJgIXHCBHyADW-rn4IQr-s2RvP7vym8u5tnzYMIs=')
+# custom-user-a
+
+NanoBot.security.decrypt('_O7OjYUESagb46YSeUeSfSMzoO1Yg0BZkjUwCcsh9sVppKvYMhd2qGRvP7vym8u5tnzYMIg=')
+# custom-user-b
+```
+
+If you lose your password, you lose your data. It is not possible to recover it at all. For real.
+
+## Supported Providers
+
+- [ ] [Anthropic Claude](https://www.anthropic.com)
+- [x] [Cohere Command](https://cohere.com)
+- [x] [Google Gemini](https://deepmind.google/technologies/gemini)
+- [x] [Maritaca AI MariTalk](https://www.maritaca.ai)
+- [x] [Mistral AI](https://mistral.ai)
+- [x] [Ollama](https://ollama.ai)
+ - [x] [01.AI Yi](https://01.ai)
+ - [x] [LMSYS Vicuna](https://github.com/lm-sys/FastChat)
+ - [x] [Meta Llama](https://ai.meta.com/llama/)
+ - [x] [WizardLM](https://wizardlm.github.io)
+- [x] [OpenAI ChatGPT](https://openai.com/chatgpt)
+
+01.AI Yi, LMSYS Vicuna, Meta Llama, and WizardLM are open-source models that are supported through [Ollama](https://ollama.ai).
+
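For illustration, here is a minimal sketch of a cartridge for one of these open-source models, assuming the model has already been pulled locally (e.g. with `ollama pull llama2`); it differs from the Ollama cartridge above only in the `model` setting:

```yaml
---
meta:
  symbol: 🦙
  name: Llama 2 Bot
  author: Your Name
  version: 1.0.0
  license: CC0-1.0
  description: A helpful assistant.

behaviors:
  interaction:
    directive: You are a helpful assistant.

provider:
  id: ollama
  credentials:
    address: ENV/OLLAMA_API_ADDRESS
  settings:
    model: llama2
```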
## Docker
Clone the repository and copy the Docker Compose template:
@@ -607,7 +956,7 @@ cp docker-compose.example.yml docker-compose.yml
Set your provider credentials and choose your desired directory for the cartridges files:
-### Cohere Command
+### Cohere Command Container
```yaml
---
@@ -616,7 +965,6 @@ services:
image: ruby:3.2.2-slim-bookworm
command: sh -c "apt-get update && apt-get install -y --no-install-recommends build-essential libffi-dev libsodium-dev lua5.4-dev curl && curl -s https://raw.githubusercontent.com/babashka/babashka/master/install | bash && gem install nano-bots -v 2.4.1 && bash"
environment:
- COHERE_API_ADDRESS: https://api.cohere.ai
COHERE_API_KEY: your-api-key
NANO_BOTS_ENCRYPTION_PASSWORD: UNSAFE
NANO_BOTS_END_USER: your-user
@@ -625,7 +973,24 @@ services:
- ./your-state-path:/root/.local/state/nano-bots
```
-### Mistral AI
+### Maritaca AI MariTalk Container
+
+```yaml
+---
+services:
+ nano-bots:
+ image: ruby:3.2.2-slim-bookworm
+ command: sh -c "apt-get update && apt-get install -y --no-install-recommends build-essential libffi-dev libsodium-dev lua5.4-dev curl && curl -s https://raw.githubusercontent.com/babashka/babashka/master/install | bash && gem install nano-bots -v 2.4.1 && bash"
+ environment:
+ MARITACA_API_KEY: your-api-key
+ NANO_BOTS_ENCRYPTION_PASSWORD: UNSAFE
+ NANO_BOTS_END_USER: your-user
+ volumes:
+ - ./your-cartridges:/root/.local/share/nano-bots/cartridges
+ - ./your-state-path:/root/.local/state/nano-bots
+```
+
+### Mistral AI Container
```yaml
---
@@ -634,7 +999,6 @@ services:
image: ruby:3.2.2-slim-bookworm
command: sh -c "apt-get update && apt-get install -y --no-install-recommends build-essential libffi-dev libsodium-dev lua5.4-dev curl && curl -s https://raw.githubusercontent.com/babashka/babashka/master/install | bash && gem install nano-bots -v 2.4.1 && bash"
environment:
- MISTRAL_API_ADDRESS: https://api.mistral.ai
MISTRAL_API_KEY: your-api-key
NANO_BOTS_ENCRYPTION_PASSWORD: UNSAFE
NANO_BOTS_END_USER: your-user
@@ -643,7 +1007,26 @@ services:
- ./your-state-path:/root/.local/state/nano-bots
```
-### OpenAI ChatGPT
+### Ollama Container
+
+Remember that your `localhost` is inaccessible from inside Docker. You need to either establish [inter-container networking](https://docs.docker.com/compose/networking/) or use the [host's address](https://docs.docker.com/desktop/networking/#i-want-to-connect-from-a-container-to-a-service-on-the-host), depending on where the Ollama server is running.
+
+```yaml
+---
+services:
+ nano-bots:
+ image: ruby:3.2.2-slim-bookworm
+ command: sh -c "apt-get update && apt-get install -y --no-install-recommends build-essential libffi-dev libsodium-dev lua5.4-dev curl && curl -s https://raw.githubusercontent.com/babashka/babashka/master/install | bash && gem install nano-bots -v 2.4.1 && bash"
+ environment:
+ OLLAMA_API_ADDRESS: http://host.docker.internal:11434
+ NANO_BOTS_ENCRYPTION_PASSWORD: UNSAFE
+ NANO_BOTS_END_USER: your-user
+ volumes:
+ - ./your-cartridges:/root/.local/share/nano-bots/cartridges
+ - ./your-state-path:/root/.local/state/nano-bots
+```
+
+### OpenAI ChatGPT Container
```yaml
---
@@ -652,7 +1035,6 @@ services:
image: ruby:3.2.2-slim-bookworm
command: sh -c "apt-get update && apt-get install -y --no-install-recommends build-essential libffi-dev libsodium-dev lua5.4-dev curl && curl -s https://raw.githubusercontent.com/babashka/babashka/master/install | bash && gem install nano-bots -v 2.4.1 && bash"
environment:
- OPENAI_API_ADDRESS: https://api.openai.com
OPENAI_API_KEY: your-access-token
NANO_BOTS_ENCRYPTION_PASSWORD: UNSAFE
NANO_BOTS_END_USER: your-user
@@ -661,9 +1043,9 @@ services:
- ./your-state-path:/root/.local/state/nano-bots
```
-### Google Gemini
+### Google Gemini Container
-#### Option 1: API Key (Generative Language API)
+#### Option 1: API Key (Generative Language API) Config
```yaml
---
@@ -680,7 +1062,7 @@ services:
- ./your-state-path:/root/.local/state/nano-bots
```
-#### Option 2: Service Account Credentials File (Vertex AI API)
+#### Option 2: Service Account Credentials File (Vertex AI API) Config
```yaml
---
@@ -699,7 +1081,7 @@ services:
- ./your-state-path:/root/.local/state/nano-bots
```
-#### Option 3: Application Default Credentials (Vertex AI API)
+#### Option 3: Application Default Credentials (Vertex AI API) Config
```yaml
---
@@ -716,7 +1098,7 @@ services:
- ./your-state-path:/root/.local/state/nano-bots
```
-#### Custom Project ID
+#### Custom Project ID Config
If you need to manually set a Google Project ID:
```yaml
@@ -724,7 +1106,7 @@ environment:
GOOGLE_PROJECT_ID=your-project-id
```
-### Container
+### Running the Container
Enter the container:
```sh
@@ -742,128 +1124,6 @@ nb assistant.yml - repl
You can exit the REPL by typing `exit`.
-## Security and Privacy
-
-Each provider will have its own security and privacy policies (e.g. [OpenAI Policy](https://openai.com/policies/api-data-usage-policies)), so you must consult them to understand their implications.
-
-### Cryptography
-
-By default, all states stored in your local disk are encrypted.
-
-To ensure that the encryption is secure, you need to define a password through the `NANO_BOTS_ENCRYPTION_PASSWORD` environment variable. Otherwise, although the content will still be encrypted, anyone could decrypt it, since no password would be required.
-
-It's important to note that content shared with providers, despite being transmitted over secure connections (e.g., [HTTPS](https://en.wikipedia.org/wiki/HTTPS)), will be readable by the provider: providers need to operate on the data, which would not be possible if it were encrypted beyond HTTPS. In short, the data stored locally on your system is encrypted, but whatever you share with providers remains readable by them.
-
-To ensure that your encryption and password are configured properly, you can run the following command:
-```sh
-nb security
-```
-
-Which should return:
-```text
-✅ Encryption is enabled and properly working.
- This means that your data is stored in an encrypted format on your disk.
-
-✅ A password is being used for the encrypted content.
- This means that only those who possess the password can decrypt your data.
-```
-
-Alternatively, you can check it at runtime with:
-```ruby
-require 'nano-bots'
-
-NanoBot.security.check
-# => { encryption: true, password: true }
-```
-
-### End-user IDs
-
-A common strategy for deploying Nano Bots to multiple users through APIs or automations is to assign a unique [end-user ID](https://platform.openai.com/docs/guides/safety-best-practices/end-user-ids) to each user. This can be useful if any of your users violate the provider's policy through abusive behavior. By providing the end-user ID, you can demonstrate that even though the activity originated from your API key, the actions taken were not your own.
-
-You can define custom end-user identifiers in the following way:
-
-```ruby
-NanoBot.new(environment: { NANO_BOTS_END_USER: 'custom-user-a' })
-NanoBot.new(environment: { NANO_BOTS_END_USER: 'custom-user-b' })
-```
-
-Consider that you have the following end-user identifier in your environment:
-```sh
-NANO_BOTS_END_USER=your-name
-```
-
-Or a configuration in your Cartridge:
-```yml
----
-provider:
- id: openai
- settings:
- user: your-name
-```
-
-The requests will be performed as follows:
-
-```ruby
-NanoBot.new(cartridge: '-')
-# { user: 'your-name' }
-
-NanoBot.new(cartridge: '-', environment: { NANO_BOTS_END_USER: 'custom-user-a' })
-# { user: 'custom-user-a' }
-
-NanoBot.new(cartridge: '-', environment: { NANO_BOTS_END_USER: 'custom-user-b' })
-# { user: 'custom-user-b' }
-```
-
-In practice, to enhance privacy, neither your identifier nor your users' identifiers are shared in this way. Instead, they are encrypted before being sent to the provider:
-
-```ruby
-'your-name'
-# _O7OjYUESagb46YSeUeSfSMzoO1Yg0BZqpsAkPg4j62SeNYlgwq3kn51Ob2wmIehoA==
-
-'custom-user-a'
-# _O7OjYUESagb46YSeUeSfSMzoO1Yg0BZJgIXHCBHyADW-rn4IQr-s2RvP7vym8u5tnzYMIs=
-
-'custom-user-b'
-# _O7OjYUESagb46YSeUeSfSMzoO1Yg0BZkjUwCcsh9sVppKvYMhd2qGRvP7vym8u5tnzYMIg=
-```
-
-In this manner, you still possess identifiers if required; however, their actual content can only be decrypted by you, through your secure password (`NANO_BOTS_ENCRYPTION_PASSWORD`).
-
-### Decrypting
-
-To decrypt your encrypted data, once you have properly configured your password, you can simply run:
-
-```ruby
-require 'nano-bots'
-
-NanoBot.security.decrypt('_O7OjYUESagb46YSeUeSfSMzoO1Yg0BZqpsAkPg4j62SeNYlgwq3kn51Ob2wmIehoA==')
-# your-name
-
-NanoBot.security.decrypt('_O7OjYUESagb46YSeUeSfSMzoO1Yg0BZJgIXHCBHyADW-rn4IQr-s2RvP7vym8u5tnzYMIs=')
-# custom-user-a
-
-NanoBot.security.decrypt('_O7OjYUESagb46YSeUeSfSMzoO1Yg0BZkjUwCcsh9sVppKvYMhd2qGRvP7vym8u5tnzYMIg=')
-# custom-user-b
-```
-
-If you lose your password, you lose your data. It is not possible to recover it at all. For real.
-
-## Providers
-
-Currently supported providers:
-
-- [ ] [01.AI Yi](https://01.ai)
-- [ ] [Anthropic Claude](https://www.anthropic.com)
-- [x] [Cohere Command](https://docs.cohere.com/reference/about)
-- [x] [Google Gemini](https://cloud.google.com/vertex-ai/docs/generative-ai/model-reference/gemini)
-- [ ] [LMSYS Org FastChat Vicuna](https://github.com/lm-sys/FastChat)
-- [ ] [Meta Llama](https://ai.meta.com/llama/)
-- [x] [Mistral AI](https://docs.mistral.ai/api/)
-- [x] [Open AI ChatGPT](https://platform.openai.com/docs/api-reference)
-- [ ] [WizardLM](https://wizardlm.github.io)
-
-Some providers offer APIs compatible with other providers' APIs: [FastChat](https://github.com/lm-sys/FastChat#openai-compatible-restful-apis--sdk), for example, is OpenAI-compatible. Therefore, it is likely that such providers will work just fine.
-
## Development
```bash
@@ -872,6 +1132,9 @@ rubocop -A
rspec
bundle exec ruby spec/tasks/run-all-models.rb
+
+bundle exec ruby spec/tasks/run-model.rb spec/data/cartridges/models/openai/gpt-4-turbo.yml
+bundle exec ruby spec/tasks/run-model.rb spec/data/cartridges/models/openai/gpt-4-turbo.yml stream
```
### Publish to RubyGems
diff --git a/components/provider.rb b/components/provider.rb
index ac3964d..4c409d2 100644
--- a/components/provider.rb
+++ b/components/provider.rb
@@ -1,9 +1,11 @@
# frozen_string_literal: true
-require_relative 'providers/google'
-require_relative 'providers/mistral'
require_relative 'providers/openai'
+require_relative 'providers/ollama'
+require_relative 'providers/mistral'
+require_relative 'providers/google'
require_relative 'providers/cohere'
+require_relative 'providers/maritaca'
module NanoBot
module Components
@@ -12,12 +14,16 @@ module NanoBot
case provider[:id]
when 'openai'
Providers::OpenAI.new(nil, provider[:settings], provider[:credentials], environment:)
- when 'google'
- Providers::Google.new(provider[:options], provider[:settings], provider[:credentials], environment:)
+ when 'ollama'
+ Providers::Ollama.new(provider[:options], provider[:settings], provider[:credentials], environment:)
when 'mistral'
Providers::Mistral.new(provider[:options], provider[:settings], provider[:credentials], environment:)
+ when 'google'
+ Providers::Google.new(provider[:options], provider[:settings], provider[:credentials], environment:)
when 'cohere'
Providers::Cohere.new(provider[:options], provider[:settings], provider[:credentials], environment:)
+ when 'maritaca'
+ Providers::Maritaca.new(provider[:options], provider[:settings], provider[:credentials], environment:)
else
raise "Unsupported provider \"#{provider[:id]}\""
end
diff --git a/components/providers/cohere.rb b/components/providers/cohere.rb
index 9b9f045..970837e 100644
--- a/components/providers/cohere.rb
+++ b/components/providers/cohere.rb
@@ -76,7 +76,7 @@ module NanoBot
if streaming
content = ''
- stream_call_back = proc do |event, _parsed, _raw|
+ stream_call_back = proc do |event, _raw|
partial_content = event['text']
if partial_content && event['event_type'] == 'text-generation'
diff --git a/components/providers/maritaca.rb b/components/providers/maritaca.rb
new file mode 100644
index 0000000..7a6fbe9
--- /dev/null
+++ b/components/providers/maritaca.rb
@@ -0,0 +1,113 @@
+# frozen_string_literal: true
+
+require 'maritaca-ai'
+
+require_relative 'base'
+
+require_relative '../../logic/providers/maritaca/tokens'
+require_relative '../../logic/helpers/hash'
+require_relative '../../logic/cartridge/default'
+
+module NanoBot
+ module Components
+ module Providers
+ class Maritaca < Base
+ attr_reader :settings
+
+ CHAT_SETTINGS = %i[
+ max_tokens model do_sample temperature top_p repetition_penalty stopping_tokens
+ ].freeze
+
+ def initialize(options, settings, credentials, _environment)
+ @settings = settings
+
+ maritaca_options = if options
+ options.transform_keys { |key| key.to_s.gsub('-', '_').to_sym }
+ else
+ {}
+ end
+
+ unless maritaca_options.key?(:stream)
+ maritaca_options[:stream] = Logic::Helpers::Hash.fetch(
+ Logic::Cartridge::Default.instance.values, %i[provider options stream]
+ )
+ end
+
+ maritaca_options[:server_sent_events] = maritaca_options.delete(:stream)
+
+ @client = ::Maritaca.new(
+ credentials: credentials.transform_keys { |key| key.to_s.gsub('-', '_').to_sym },
+ options: maritaca_options
+ )
+ end
+
+ def evaluate(input, streaming, cartridge, &feedback)
+ messages = input[:history].map do |event|
+ { role: event[:who] == 'user' ? 'user' : 'assistant',
+ content: event[:message],
+ _meta: { at: event[:at] } }
+ end
+
+ # TODO: Does Maritaca have system messages?
+ %i[backdrop directive].each do |key|
+ next unless input[:behavior][key]
+
+ messages.prepend(
+ { role: 'user',
+ content: input[:behavior][key],
+ _meta: { at: Time.now } }
+ )
+ end
+
+ payload = { chat_mode: true, messages: }
+
+ CHAT_SETTINGS.each do |key|
+ payload[key] = @settings[key] unless payload.key?(key) || !@settings.key?(key)
+ end
+
+ raise 'Maritaca does not support tools.' if input[:tools]
+
+ if streaming
+ content = ''
+
+ stream_call_back = proc do |event, _raw|
+ partial_content = event['answer']
+
+ if partial_content
+ content += partial_content
+ feedback.call(
+ { should_be_stored: false,
+ interaction: { who: 'AI', message: partial_content } }
+ )
+ end
+ end
+
+ @client.chat_inference(
+ Logic::Maritaca::Tokens.apply_policies!(cartridge, payload),
+ server_sent_events: true, &stream_call_back
+ )
+
+ feedback.call(
+ { should_be_stored: !(content.nil? || content == ''),
+ interaction: content.nil? || content == '' ? nil : { who: 'AI', message: content },
+ finished: true }
+ )
+ else
+ result = @client.chat_inference(
+ Logic::Maritaca::Tokens.apply_policies!(cartridge, payload),
+ server_sent_events: false
+ )
+
+ content = result['answer']
+
+ feedback.call(
+ { should_be_stored: !(content.nil? || content.to_s.strip == ''),
+ interaction: content.nil? || content == '' ? nil : { who: 'AI', message: content },
+ finished: true }
+ )
+ end
+ end
+ end
+ end
+ end
+end
diff --git a/components/providers/ollama.rb b/components/providers/ollama.rb
new file mode 100644
index 0000000..9edb461
--- /dev/null
+++ b/components/providers/ollama.rb
@@ -0,0 +1,132 @@
+# frozen_string_literal: true
+
+require 'ollama-ai'
+
+require_relative 'base'
+
+require_relative '../../logic/providers/ollama/tokens'
+require_relative '../../logic/helpers/hash'
+require_relative '../../logic/cartridge/default'
+
+module NanoBot
+ module Components
+ module Providers
+ class Ollama < Base
+ attr_reader :settings
+
+ CHAT_SETTINGS = %i[
+ model template stream
+ ].freeze
+
+ CHAT_OPTIONS = %i[
+ mirostat mirostat_eta mirostat_tau num_ctx num_gqa num_gpu num_thread repeat_last_n
+ repeat_penalty temperature seed stop tfs_z num_predict top_k top_p
+ ].freeze
+
+ def initialize(options, settings, credentials, _environment)
+ @settings = settings
+
+ ollama_options = if options
+ options.transform_keys { |key| key.to_s.gsub('-', '_').to_sym }
+ else
+ {}
+ end
+
+ unless @settings.key?(:stream)
+ @settings = Marshal.load(Marshal.dump(@settings))
+ @settings[:stream] = Logic::Helpers::Hash.fetch(
+ Logic::Cartridge::Default.instance.values, %i[provider settings stream]
+ )
+ end
+
+ ollama_options[:server_sent_events] = @settings[:stream]
+
+ credentials ||= {}
+
+ @client = ::Ollama.new(
+ credentials: credentials.transform_keys { |key| key.to_s.gsub('-', '_').to_sym },
+ options: ollama_options
+ )
+ end
+
+ def evaluate(input, streaming, cartridge, &feedback)
+ messages = input[:history].map do |event|
+ { role: event[:who] == 'user' ? 'user' : 'assistant',
+ content: event[:message],
+ _meta: { at: event[:at] } }
+ end
+
+ %i[backdrop directive].each do |key|
+ next unless input[:behavior][key]
+
+ messages.prepend(
+ { role: key == :directive ? 'system' : 'user',
+ content: input[:behavior][key],
+ _meta: { at: Time.now } }
+ )
+ end
+
+ payload = { messages: }
+
+ CHAT_SETTINGS.each do |key|
+ payload[key] = @settings[key] unless payload.key?(key) || !@settings.key?(key)
+ end
+
+ if @settings.key?(:options)
+ options = {}
+
+ CHAT_OPTIONS.each do |key|
+ options[key] = @settings[:options][key] unless options.key?(key) || !@settings[:options].key?(key)
+ end
+
+ payload[:options] = options unless options.empty?
+ end
+
+ raise 'Ollama does not support tools.' if input[:tools]
+
+ if streaming
+ content = ''
+
+ stream_call_back = proc do |event, _raw|
+ partial_content = event.dig('message', 'content')
+
+ if partial_content
+ content += partial_content
+ feedback.call(
+ { should_be_stored: false,
+ interaction: { who: 'AI', message: partial_content } }
+ )
+ end
+
+ if event['done']
+ feedback.call(
+ { should_be_stored: !(content.nil? || content == ''),
+ interaction: content.nil? || content == '' ? nil : { who: 'AI', message: content },
+ finished: true }
+ )
+ end
+ end
+
+ @client.chat(
+ Logic::Ollama::Tokens.apply_policies!(cartridge, payload),
+ server_sent_events: true, &stream_call_back
+ )
+ else
+ result = @client.chat(
+ Logic::Ollama::Tokens.apply_policies!(cartridge, payload),
+ server_sent_events: false
+ )
+
+ content = result.map { |event| event.dig('message', 'content') }.join
+
+ feedback.call(
+ { should_be_stored: !(content.nil? || content.to_s.strip == ''),
+ interaction: content.nil? || content == '' ? nil : { who: 'AI', message: content },
+ finished: true }
+ )
+ end
+ end
+ end
+ end
+ end
+end
diff --git a/logic/cartridge/streaming.rb b/logic/cartridge/streaming.rb
index 23e88ac..b04dc6e 100644
--- a/logic/cartridge/streaming.rb
+++ b/logic/cartridge/streaming.rb
@@ -8,9 +8,9 @@ module NanoBot
module Streaming
def self.enabled?(cartridge, interface)
provider_stream = case Helpers::Hash.fetch(cartridge, %i[provider id])
- when 'openai', 'mistral', 'cohere'
+ when 'openai', 'mistral', 'cohere', 'ollama'
Helpers::Hash.fetch(cartridge, %i[provider settings stream])
- when 'google'
+ when 'google', 'maritaca'
Helpers::Hash.fetch(cartridge, %i[provider options stream])
end
diff --git a/logic/providers/maritaca/tokens.rb b/logic/providers/maritaca/tokens.rb
new file mode 100644
index 0000000..1ae2219
--- /dev/null
+++ b/logic/providers/maritaca/tokens.rb
@@ -0,0 +1,14 @@
+# frozen_string_literal: true
+
+module NanoBot
+ module Logic
+ module Maritaca
+ module Tokens
+ def self.apply_policies!(_cartridge, payload)
+ payload[:messages] = payload[:messages].map { |message| message.except(:_meta) }
+ payload
+ end
+ end
+ end
+ end
+end
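The `apply_policies!` hook above only strips the internal `_meta` bookkeeping each message carries before the payload reaches the provider's API. A standalone illustration of that transformation (plain Ruby, no gem required; the sample messages are hypothetical):

```ruby
# Messages carry an internal :_meta key (timestamps) used for local state;
# it must be removed before the payload is sent to the API.
# Hash#except requires Ruby 3.0+.
payload = {
  chat_mode: true,
  messages: [
    { role: 'user', content: 'Oi!', _meta: { at: Time.now } },
    { role: 'assistant', content: 'Olá!', _meta: { at: Time.now } }
  ]
}

# Same one-liner the policy uses: drop :_meta from every message.
payload[:messages] = payload[:messages].map { |message| message.except(:_meta) }

payload[:messages].each { |message| puts message.keys.inspect }
```

The role and content keys survive untouched; only the local bookkeeping is dropped.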
diff --git a/logic/providers/ollama/tokens.rb b/logic/providers/ollama/tokens.rb
new file mode 100644
index 0000000..45500fb
--- /dev/null
+++ b/logic/providers/ollama/tokens.rb
@@ -0,0 +1,14 @@
+# frozen_string_literal: true
+
+module NanoBot
+ module Logic
+ module Ollama
+ module Tokens
+ def self.apply_policies!(_cartridge, payload)
+ payload[:messages] = payload[:messages].map { |message| message.except(:_meta) }
+ payload
+ end
+ end
+ end
+ end
+end
diff --git a/nano-bots.gemspec b/nano-bots.gemspec
index d30d44e..946559a 100644
--- a/nano-bots.gemspec
+++ b/nano-bots.gemspec
@@ -32,16 +32,19 @@ Gem::Specification.new do |spec|
spec.executables = ['nb']
spec.add_dependency 'babosa', '~> 2.0'
- spec.add_dependency 'cohere-ai', '~> 1.0', '>= 1.0.1'
spec.add_dependency 'concurrent-ruby', '~> 1.2', '>= 1.2.2'
spec.add_dependency 'dotenv', '~> 2.8', '>= 2.8.1'
- spec.add_dependency 'gemini-ai', '~> 3.1'
- spec.add_dependency 'mistral-ai', '~> 1.1'
spec.add_dependency 'pry', '~> 0.14.2'
spec.add_dependency 'rainbow', '~> 3.1', '>= 3.1.1'
spec.add_dependency 'rbnacl', '~> 7.1', '>= 7.1.1'
- spec.add_dependency 'ruby-openai', '~> 6.3', '>= 6.3.1'
spec.add_dependency 'sweet-moon', '~> 0.0.7'
+ spec.add_dependency 'cohere-ai', '~> 1.0', '>= 1.0.1'
+ spec.add_dependency 'gemini-ai', '~> 3.1', '>= 3.1.2'
+ spec.add_dependency 'maritaca-ai', '~> 1.0'
+ spec.add_dependency 'mistral-ai', '~> 1.1'
+ spec.add_dependency 'ollama-ai', '~> 1.0'
+ spec.add_dependency 'ruby-openai', '~> 6.3', '>= 6.3.1'
+
spec.metadata['rubygems_mfa_required'] = 'true'
end
diff --git a/spec/data/cartridges/models/maritaca/maritalk.yml b/spec/data/cartridges/models/maritaca/maritalk.yml
new file mode 100644
index 0000000..5b99086
--- /dev/null
+++ b/spec/data/cartridges/models/maritaca/maritalk.yml
@@ -0,0 +1,12 @@
+---
+meta:
+ symbol: 🦜
+ name: Maritaca MariTalk
+ license: CC0-1.0
+
+provider:
+ id: maritaca
+ credentials:
+ api-key: ENV/MARITACA_API_KEY
+ settings:
+ model: maritalk
diff --git a/spec/data/cartridges/models/ollama/llama2.yml b/spec/data/cartridges/models/ollama/llama2.yml
new file mode 100644
index 0000000..7f20753
--- /dev/null
+++ b/spec/data/cartridges/models/ollama/llama2.yml
@@ -0,0 +1,10 @@
+---
+meta:
+ symbol: 🦙
+ name: Llama 2 through Ollama
+ license: CC0-1.0
+
+provider:
+ id: ollama
+ settings:
+ model: llama2
diff --git a/spec/tasks/run-model.rb b/spec/tasks/run-model.rb
new file mode 100644
index 0000000..4b235aa
--- /dev/null
+++ b/spec/tasks/run-model.rb
@@ -0,0 +1,39 @@
+# frozen_string_literal: true
+
+require 'dotenv/load'
+
+require 'yaml'
+
+require_relative '../../ports/dsl/nano-bots'
+require_relative '../../logic/helpers/hash'
+
+def run_model!(cartridge, stream = true)
+ if stream == false
+ cartridge[:provider][:options] = {} unless cartridge[:provider].key?(:options)
+ cartridge[:provider][:options][:stream] = false
+
+ cartridge[:provider][:settings] = {} unless cartridge[:provider].key?(:settings)
+ cartridge[:provider][:settings][:stream] = false
+ end
+
+ puts "\n#{cartridge[:meta][:symbol]} #{cartridge[:meta][:name]}\n\n"
+
+ bot = NanoBot.new(cartridge:)
+
+ output = bot.eval('Hi!') do |_content, fragment, _finished, _meta|
+ print fragment unless fragment.nil?
+ end
+ puts ''
+ puts '-' * 20
+ puts ''
+ puts output
+ puts ''
+ puts '*' * 20
+end
+
+run_model!(
+ NanoBot::Logic::Helpers::Hash.symbolize_keys(
+ YAML.safe_load_file(ARGV[0].to_s.strip, permitted_classes: [Symbol])
+ ),
+ ARGV[1].to_s.strip == 'stream'
+)
diff --git a/static/gem.rb b/static/gem.rb
index c9be098..c89b4c3 100644
--- a/static/gem.rb
+++ b/static/gem.rb
@@ -3,11 +3,11 @@
module NanoBot
GEM = {
name: 'nano-bots',
- version: '2.4.1',
- specification: '2.2.0',
+ version: '2.5.0',
+ specification: '2.3.0',
author: 'icebaker',
- summary: 'Ruby Implementation of Nano Bots: small, AI-powered bots for OpenAI ChatGPT, Mistral AI, and Google Gemini.',
- description: 'Ruby Implementation of Nano Bots: small, AI-powered bots that can be easily shared as a single file, designed to support multiple providers such as OpenAI ChatGPT, Mistral AI, and Google Gemini, with support for calling Tools (Functions).',
+ summary: 'Ruby Implementation of Nano Bots: small, AI-powered bots for OpenAI ChatGPT, Ollama, Mistral AI, Cohere Command, Maritaca AI MariTalk, and Google Gemini.',
+ description: 'Ruby Implementation of Nano Bots: small, AI-powered bots that can be easily shared as a single file, designed to support multiple providers such as OpenAI ChatGPT, Ollama, Mistral AI, Cohere Command, Maritaca AI MariTalk, and Google Gemini, with support for calling Tools (Functions).',
github: 'https://github.com/icebaker/ruby-nano-bots',
gem_server: 'https://rubygems.org',
license: 'MIT',