Native Ollama LLM Integration + Example Project + Full Unit Tests #3570
Open
ayman3000 wants to merge 67 commits into google:main from ayman3000:feature/ollama-llm
+1,044 −0
Commits (changes shown from all 67 commits; all by ayman3000):
4cee25d Add native Ollama LLM support
fe1ee91 Fix Ollama integration: add model_version, usage metadata, safe JSON …
3a786dd Merge branch 'main' into feature/ollama-llm
28ca391 Merge branch 'main' into feature/ollama-llm
95e601c Merge branch 'main' into feature/ollama-llm
a40e261 Merge branch 'main' into feature/ollama-llm
040e253 Merge branch 'main' into feature/ollama-llm
a79433c Merge branch 'main' into feature/ollama-llm
2288461 Fix formatting and imports for CI
b909b56 Fix formatting and imports for CI
b20848c Merge branch 'main' into feature/ollama-llm
4e8d9f4 Merge branch 'main' into feature/ollama-llm
f0a7138 Merge branch 'main' into feature/ollama-llm
d5fca86 Merge branch 'main' into feature/ollama-llm
74c8d72 Merge branch 'main' into feature/ollama-llm
f97d4bc Merge branch 'main' into feature/ollama-llm
e2ab4b2 Merge branch 'main' into feature/ollama-llm
b9c11e5 Merge branch 'main' into feature/ollama-llm
0aa1f9f Merge branch 'main' into feature/ollama-llm
f0b3f98 Fix hello_world_ollama_native/agent.py formatting
d646742 Merge branch 'main' into feature/ollama-llm
e4e33df Merge branch 'main' into feature/ollama-llm
5b20acf Merge branch 'main' into feature/ollama-llm
92a8b2a Merge branch 'main' into feature/ollama-llm
87de44c Merge branch 'main' into feature/ollama-llm
9318689 Merge branch 'main' into feature/ollama-llm
8bd865b Merge branch 'main' into feature/ollama-llm
5d6688a Merge branch 'main' into feature/ollama-llm
e486fe9 Merge branch 'main' into feature/ollama-llm
8e9657e Merge branch 'main' into feature/ollama-llm
ac7d9d4 Merge branch 'main' into feature/ollama-llm
661340e Merge branch 'main' into feature/ollama-llm
c98d6a3 Merge branch 'main' into feature/ollama-llm
bb37bd2 Merge branch 'main' into feature/ollama-llm
f86b0a4 Merge branch 'main' into feature/ollama-llm
e9f2e9e Merge branch 'main' into feature/ollama-llm
440b1ce Merge branch 'main' into feature/ollama-llm
d1718f0 Merge branch 'main' into feature/ollama-llm
84aaa6d Merge branch 'main' into feature/ollama-llm
6f1d15c Merge branch 'main' into feature/ollama-llm
38f54fb Merge branch 'main' into feature/ollama-llm
6ceae6d Fix formatting for CI
7ff6397 Merge branch 'main' into feature/ollama-llm
046ddc8 Merge branch 'main' into feature/ollama-llm
869570a Merge branch 'main' into feature/ollama-llm
f2d933b Merge branch 'main' into feature/ollama-llm
274bc63 Merge branch 'main' into feature/ollama-llm
97ee91e Merge branch 'main' into feature/ollama-llm
7bd3a91 Merge branch 'main' into feature/ollama-llm
d78caa7 refactor(tests): clean and reorganize Ollama test suite for readabili…
938d484 Merge branch 'main' into feature/ollama-llm
864eca7 Merge branch 'main' into feature/ollama-llm
001c5af Merge branch 'main' into feature/ollama-llm
920cd8d Merge branch 'main' into feature/ollama-llm
905a0a7 Merge branch 'main' into feature/ollama-llm
f0516f2 Merge branch 'main' into feature/ollama-llm
2e6f43c Merge branch 'main' into feature/ollama-llm
f8e876d Merge branch 'main' into feature/ollama-llm
9a968a2 Merge branch 'main' into feature/ollama-llm
b2c1bf3 Merge branch 'main' into feature/ollama-llm
5693b12 fix imports and formatting
8a07c24 Merge branch 'main' into feature/ollama-llm
4078f38 Merge branch 'main' into feature/ollama-llm
93395f6 Fix Ollama host configuration, examples, and documentation mismatches
0353759 Merge branch 'main' into feature/ollama-llm
afa35ea Merge branch 'main' into feature/ollama-llm
cd23598 Merge branch 'main' into feature/ollama-llm
contributing/samples/hello_world_ollama_native/README.md (117 additions, 0 deletions)
# Using Ollama Models with ADK (Native Integration)

## Model Choice

If your agent uses tools, choose an Ollama model that supports **function calling**.
Tool support can be verified with:

```bash
ollama show mistral-small3.1
```

Example output:

```
  Model
    architecture        mistral3
    parameters          24.0B
    context length      131072
    embedding length    5120
    quantization        Q4_K_M

  Capabilities
    completion
    vision
    tools
```

Models must list `tools` under Capabilities.
Models without tool support will not execute ADK functions correctly.

To inspect or customize a model template:

```bash
ollama show --modelfile llama3.1 > model_file_to_modify
```

Then create a modified model:

```bash
ollama create llama3.1-modified -f model_file_to_modify
```
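If the chosen model is not yet installed locally, it can be pulled and listed first. A quick sketch using the standard Ollama CLI (the model name is only an example):

```bash
# Download the model and confirm it is available locally.
ollama pull llama3.1
ollama list
```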
## Native Ollama Provider in ADK

ADK includes a native Ollama model class that communicates directly with the Ollama server at:

    http://localhost:11434/api/chat

No LiteLLM provider, API keys, or OpenAI proxy endpoints are needed.
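For reference, a raw request to that endpoint looks roughly like the following. This is a sketch of the standard Ollama chat API, shown only to illustrate the endpoint the native provider talks to; the exact payload ADK sends may differ:

```bash
# Single, non-streaming chat request against a local Ollama server.
curl http://localhost:11434/api/chat -d '{
  "model": "llama3.1",
  "messages": [{"role": "user", "content": "Say hello in one sentence."}],
  "stream": false
}'
```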
### Example agent
```python
import random
from google.adk.agents.llm_agent import Agent
from google.adk.models.ollama_llm import Ollama


def roll_die(sides: int) -> int:
  return random.randint(1, sides)


def check_prime(numbers: list) -> str:
  """Check if a given list of values contains prime numbers.

  The input may include non-integer values produced by the LLM.
  """
  primes = set()

  for number in numbers:
    try:
      number = int(number)
    except (ValueError, TypeError):
      continue

    if number <= 1:
      continue

    for i in range(2, int(number**0.5) + 1):
      if number % i == 0:
        break
    else:
      primes.add(number)

  return (
      "No prime numbers found."
      if not primes
      else f"{', '.join(str(n) for n in sorted(primes))} are prime numbers."
  )


root_agent = Agent(
    model=Ollama(model="llama3.1"),
    name="dice_agent",
    description="Agent that rolls dice and checks primes using native Ollama.",
    instruction="Always use the provided tools.",
    tools=[roll_die, check_prime],
)
```
## Connecting to a remote Ollama server

Default Ollama endpoint:

    http://localhost:11434

Override using an environment variable:
```bash
export OLLAMA_API_BASE="http://192.168.1.20:11434"
```
Or pass explicitly in code:
```python
Ollama(model="llama3.1", host="http://192.168.1.20:11434")
```
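Before starting the agent, the remote server can be sanity-checked against the standard Ollama tags endpoint (a sketch; substitute your own host):

```bash
# Lists the models available on the remote server; a JSON response confirms connectivity.
curl http://192.168.1.20:11434/api/tags
```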
## Running the Example with ADK Web

Start the ADK Web UI:

```bash
adk web hello_ollama_native
```

The interface will be available in your browser, allowing interactive testing of tool calls.
contributing/samples/hello_world_ollama_native/__init__.py (15 additions, 0 deletions)
```python
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from . import agent
```
contributing/samples/hello_world_ollama_native/agent.py (94 additions, 0 deletions)
```python
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import random

from google.adk.agents.llm_agent import Agent
from google.adk.models.ollama_llm import Ollama
from typing import Any


def roll_die(sides: int) -> int:
  """Roll a die and return the rolled result.

  Args:
    sides: The integer number of sides the die has.

  Returns:
    An integer of the result of rolling the die.
  """
  return random.randint(1, sides)


def check_prime(numbers: list[Any]) -> str:
  """Check which values in a list are prime numbers.

  Args:
    numbers: The list of values to check. Values may be non-integers
      and are safely ignored if they cannot be converted.

  Returns:
    A string indicating which numbers are prime.
  """
  primes = set()

  for number in numbers:
    try:
      number = int(number)
    except (ValueError, TypeError):
      continue  # Safely skip non-numeric values

    if number <= 1:
      continue

    for i in range(2, int(number ** 0.5) + 1):
      if number % i == 0:
        break
    else:
      primes.add(number)

  return (
      "No prime numbers found."
      if not primes
      else f"{', '.join(str(num) for num in sorted(primes))} are prime numbers."
  )


root_agent = Agent(
    model=Ollama(model="llama3.1"),
    name="dice_roll_agent",
    description=(
        "hello world agent that can roll a dice of any number of sides and"
        " check prime numbers."
    ),
    instruction="""
      You roll dice and answer questions about the outcome of the dice rolls.
      You can roll dice of different sizes.
      You can use multiple tools in parallel by calling functions in parallel(in one request and in one round).
      It is ok to discuss previous dice roles, and comment on the dice rolls.
      When you are asked to roll a die, you must call the roll_die tool with the number of sides. Be sure to pass in an integer. Do not pass in a string.
      You should never roll a die on your own.
      When checking prime numbers, call the check_prime tool with a list of integers. Be sure to pass in a list of integers. You should never pass in a string.
      You should not check prime numbers before calling the tool.
      When you are asked to roll a die and check prime numbers, you should always make the following two function calls:
      1. You should first call the roll_die tool to get a roll. Wait for the function response before calling the check_prime tool.
      2. After you get the function response from roll_die tool, you should call the check_prime tool with the roll_die result.
        2.1 If user asks you to check primes based on previous rolls, make sure you include the previous rolls in the list.
      3. When you respond, you must include the roll_die result from step 1.
      You should always perform the previous 3 steps when asking for a roll and checking prime numbers.
      You should not rely on the previous history on prime results.
    """,
    tools=[
        roll_die,
        check_prime,
    ],
)
```
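The PR title also mentions full unit tests, which are not shown in this excerpt. As a purely illustrative sketch, the tolerant input handling in `check_prime` above could be exercised like this. It is a hypothetical test module, not part of the diff, and it assumes the sample directory is importable and `google-adk` is installed, since importing `agent` also constructs the Ollama-backed `root_agent`:

```python
# test_check_prime_sketch.py: hypothetical example, not part of this PR.
from agent import check_prime


def test_mixed_input_is_tolerated():
  # Non-numeric values are skipped; 0 and 1 are never prime.
  assert check_prime(["7", "abc", 4, 1, None]) == "7 are prime numbers."


def test_no_primes_found():
  assert check_prime([0, 1, 4, "x"]) == "No prime numbers found."
```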
contributing/samples/hello_world_ollama_native/main.py (77 additions, 0 deletions)
```python
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import asyncio
import time
import warnings

import agent
from dotenv import load_dotenv
from google.adk import Runner
from google.adk.artifacts.in_memory_artifact_service import InMemoryArtifactService
from google.adk.cli.utils import logs
from google.adk.sessions.in_memory_session_service import InMemorySessionService
from google.adk.sessions.session import Session
from google.genai import types

load_dotenv(override=True)
warnings.filterwarnings('ignore', category=UserWarning)
logs.log_to_tmp_folder()


async def main():
  app_name = 'my_app'
  user_id_1 = 'user1'
  session_service = InMemorySessionService()
  artifact_service = InMemoryArtifactService()
  runner = Runner(
      app_name=app_name,
      agent=agent.root_agent,
      artifact_service=artifact_service,
      session_service=session_service,
  )
  session_11 = await session_service.create_session(
      app_name=app_name, user_id=user_id_1
  )

  async def run_prompt(session: Session, new_message: str):
    content = types.Content(
        role='user', parts=[types.Part.from_text(text=new_message)]
    )
    print('** User says:', content.model_dump(exclude_none=True))
    async for event in runner.run_async(
        user_id=user_id_1,
        session_id=session.id,
        new_message=content,
    ):
      if event.content.parts and event.content.parts[0].text:
        print(f'** {event.author}: {event.content.parts[0].text}')

  start_time = time.time()
  print('Start time:', start_time)
  print('------------------------------------')
  await run_prompt(session_11, 'Hi, introduce yourself.')
  await run_prompt(
      session_11, 'Roll a die with 100 sides and check if it is prime'
  )
  await run_prompt(session_11, 'Roll it again.')
  await run_prompt(session_11, 'What numbers did I get?')
  end_time = time.time()
  print('------------------------------------')
  print('End time:', end_time)
  print('Total time:', end_time - start_time)


if __name__ == '__main__':
  asyncio.run(main())
```
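To try the runner script directly, something like the following should work. This is a hedged sketch: it assumes the file is saved as `main.py` in the sample directory, that an Ollama server is running locally with the `llama3.1` model pulled, and that the repository's ADK dependencies are installed:

```bash
# Run the sample's console runner from inside the sample directory.
cd contributing/samples/hello_world_ollama_native
python main.py
```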
Review comment:

This documentation suggests that the Ollama host can be configured via the `OLLAMA_API_BASE` environment variable. However, the current implementation of the `Ollama` class in `ollama_llm.py` does not read from this environment variable. To avoid confusion, the implementation should be updated to support this feature as documented.
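A minimal sketch of the host-resolution behavior the comment asks for. This is illustrative only; the function name and fallback order are assumptions, not the actual `ollama_llm.py` implementation:

```python
import os


def _resolve_host(explicit_host: str | None = None) -> str:
  """Prefer an explicit host, then OLLAMA_API_BASE, then the local default."""
  # NOTE: hypothetical helper; the real Ollama class may expose this differently.
  return (
      explicit_host
      or os.environ.get("OLLAMA_API_BASE")
      or "http://localhost:11434"
  )
```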