
Commit

Merge pull request #50 from genai-impact/feat/add-google-models
Feat/add google models
adrienbanse committed Jun 27, 2024
2 parents ad759eb + 701a3cd commit 808f314
Showing 14 changed files with 963 additions and 31 deletions.
1 change: 1 addition & 0 deletions docs/index.md
@@ -36,6 +36,7 @@ EcoLogits currently supports the following providers:

- `anthropic`
- `cohere`
- `google-generativeai`
- `huggingface-hub` (Hugging Face Inference Endpoints)
- `mistralai`
- `openai`
2 changes: 2 additions & 0 deletions docs/providers.md
@@ -6,6 +6,7 @@
|------------------|------------------------|----------------------------------------------------------------------------------------|
| Anthropic | `anthropic` | [Guide for Anthropic :octicons-link-16:](tutorial/providers/anthropic.md) |
| Cohere | `cohere` | [Guide for Cohere :octicons-link-16:](tutorial/providers/cohere.md) |
| Google Gemini | `google-generativeai` | [Guide for Google Gemini :octicons-link-16:](tutorial/providers/google.md) |
| Hugging Face Hub | `huggingface-hub` | [Guide for Hugging Face Hub :octicons-link-16:](tutorial/providers/huggingface_hub.md) |
| Mistral AI | `mistralai` | [Guide for Mistral AI :octicons-link-16:](tutorial/providers/mistralai.md) |
| OpenAI | `openai` | [Guide for OpenAI :octicons-link-16:](tutorial/providers/openai.md) |
@@ -17,6 +18,7 @@
|-----------------|:----------------------------------:|:---------------------------------:|:---------------------------------:|:---------------------------------:|
| Anthropic | :material-checkbox-marked-circle: | :material-check-circle-outline: | :material-checkbox-marked-circle: | :material-check-circle-outline: |
| Cohere | :material-checkbox-marked-circle: | :material-checkbox-marked-circle: | :material-checkbox-marked-circle: | :material-checkbox-marked-circle: |
| Google Gemini | :material-checkbox-marked-circle: | :material-checkbox-marked-circle: | :material-checkbox-marked-circle: | :material-checkbox-marked-circle: |
| HuggingFace Hub | :material-checkbox-marked-circle: | :material-checkbox-marked-circle: | :material-checkbox-marked-circle: | :material-checkbox-marked-circle: |
| Mistral AI | :material-checkbox-marked-circle: | :material-checkbox-marked-circle: | :material-checkbox-marked-circle: | :material-checkbox-marked-circle: |
| OpenAI | :material-checkbox-marked-circle: | :material-checkbox-marked-circle: | :material-checkbox-marked-circle: | :material-checkbox-marked-circle: |
120 changes: 120 additions & 0 deletions docs/tutorial/providers/google.md
@@ -0,0 +1,120 @@
# Google Gemini

This guide focuses on the integration of :seedling: **EcoLogits** with the [Google Gemini official Python client :octicons-link-external-16:](https://github.com/google-gemini/generative-ai-python).

Official links:

* Repository: [:simple-github: google-gemini/generative-ai-python](https://github.com/google-gemini/generative-ai-python)
* Documentation: [:material-file-document: ai.google.dev](https://ai.google.dev/gemini-api/docs?hl=fr)


## Installation

To install EcoLogits along with all necessary dependencies for compatibility with the Google Gemini client, use the `google-generativeai` extra as follows:

```shell
pip install ecologits[google-generativeai]
```

This installation command ensures that EcoLogits is set up with the specific libraries required to interface seamlessly with the Google Gemini Python client.


## Chat Completions

### Example

Integrating EcoLogits with your applications does not alter the standard outputs from the API responses. Instead, it enriches them by adding the `Impacts` object, which contains detailed environmental impact data.

=== "Sync"

    ```python
    from ecologits import EcoLogits
    import google.generativeai as genai

    # Initialize EcoLogits
    EcoLogits.init()

    # Ask something to Google Gemini
    genai.configure(api_key="<GOOGLE_API_KEY>")
    model = genai.GenerativeModel("gemini-1.5-flash")
    response = model.generate_content("Write a story about a magic backpack.")

    # Get estimated environmental impacts of the inference
    print(response.impacts)
    ```

=== "Async"

    ```python
    import asyncio
    from ecologits import EcoLogits
    import google.generativeai as genai

    # Initialize EcoLogits
    EcoLogits.init()

    # Ask something to Google Gemini in async mode
    async def main() -> None:
        genai.configure(api_key="<GOOGLE_API_KEY>")
        model = genai.GenerativeModel("gemini-1.5-flash")
        response = await model.generate_content_async(
            "Write a story about a magic backpack."
        )

        # Get estimated environmental impacts of the inference
        print(response.impacts)

    asyncio.run(main())
    ```

### Streaming example

**In streaming mode, the impacts are calculated incrementally**, which means you don't need to sum the impacts from each data chunk. Instead, the impact information in the last chunk reflects the total cumulative environmental impacts for the entire request.

=== "Sync"

    ```python
    from ecologits import EcoLogits
    import google.generativeai as genai

    # Initialize EcoLogits
    EcoLogits.init()

    # Ask something to Google Gemini in streaming mode
    genai.configure(api_key="<GOOGLE_API_KEY>")
    model = genai.GenerativeModel("gemini-1.5-flash")
    stream = model.generate_content(
        "Write a story about a magic backpack.",
        stream=True
    )

    # Get cumulative estimated environmental impacts of the inference
    for chunk in stream:
        print(chunk.impacts)
    ```

=== "Async"

    ```python
    import asyncio
    from ecologits import EcoLogits
    import google.generativeai as genai

    # Initialize EcoLogits
    EcoLogits.init()

    # Ask something to Google Gemini in streaming and async mode
    async def main() -> None:
        genai.configure(api_key="<GOOGLE_API_KEY>")
        model = genai.GenerativeModel("gemini-1.5-flash")
        stream = await model.generate_content_async(
            "Write a story about a magic backpack.",
            stream=True
        )

        # Get cumulative estimated environmental impacts of the inference
        async for chunk in stream:
            print(chunk.impacts)

    asyncio.run(main())
    ```
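The cumulative behavior described above can be illustrated with a toy sketch (the numbers are made up and stand in for per-chunk impact values, not real EcoLogits output; the helper name is illustrative):

```python
def running_totals(chunk_values):
    # Mimics streaming mode: each yielded value is the cumulative total
    # so far, so the last value covers the whole request.
    total = 0.0
    out = []
    for value in chunk_values:
        total += value
        out.append(total)
    return out

print(running_totals([0.5, 0.25, 0.25]))  # each entry is the running sum
```

Because the last chunk already carries the full total, summing `chunk.impacts` across chunks would double-count.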
5 changes: 4 additions & 1 deletion ecologits/data/models.csv
@@ -108,4 +108,7 @@ huggingface_hub,databricks/dbrx-instruct,132,36,,https://huggingface.co/databric
 huggingface_hub,databricks/dolly-v1-6b,6,6,,https://huggingface.co/databricks/dolly-v1-6b
 huggingface_hub,databricks/dolly-v2-12b,12,12,,https://huggingface.co/databricks/dolly-v2-12b
 huggingface_hub,databricks/dolly-v2-7b,7,7,,https://huggingface.co/databricks/dolly-v2-7b
-huggingface_hub,databricks/dolly-v2-3b,3,3,,https://huggingface.co/databricks/dolly-v2-3b
+huggingface_hub,databricks/dolly-v2-3b,3,3,,https://huggingface.co/databricks/dolly-v2-3b
+google,gemini-1.5-flash,20;70,20;70,model_architecture_not_released,https://deepmind.google/technologies/gemini/flash/
+google,gemini-1.5-pro,440,55;220,model_architecture_not_released,https://deepmind.google/technologies/gemini/pro/
+google,gemini-1.0-pro,20;70,20;70,model_architecture_not_released,https://deepmind.google/technologies/gemini/pro/
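In `models.csv`, a `;` inside the parameter columns encodes an estimated `min;max` range in billions of parameters (e.g. `20;70` for `gemini-1.5-flash`), while a single number is a point estimate and an empty field means unknown. A minimal sketch of that convention, with a hypothetical helper name:

```python
def parse_parameter_field(field: str):
    # Returns (exact_value, value_range): a ";"-separated field is a
    # [min, max] range, a single number is exact, empty means unknown.
    if ";" in field:
        return None, [float(p) for p in field.split(";")]
    if field != "":
        return float(field), None
    return None, None
```

This mirrors how `ModelRepository.from_csv` in `ecologits/model_repository.py` interprets the same columns.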
9 changes: 9 additions & 0 deletions ecologits/ecologits.py
@@ -49,6 +49,7 @@ def init_instruments() -> None:
     init_mistralai_instrumentor()
     init_huggingface_instrumentor()
     init_cohere_instrumentor()
+    init_google_instrumentor()
 
 
 def init_openai_instrumentor() -> None:
@@ -92,3 +93,11 @@ def init_cohere_instrumentor() -> None:
 
     instrumentor = CohereInstrumentor()
     instrumentor.instrument()
+
+
+def init_google_instrumentor() -> None:
+    if importlib.util.find_spec("google") is not None:
+        from ecologits.tracers.google_tracer import GoogleInstrumentor
+
+        instrumentor = GoogleInstrumentor()
+        instrumentor.instrument()
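The guard in `init_google_instrumentor` relies on `importlib.util.find_spec`, which returns `None` when a package cannot be imported, so instrumentation is silently skipped for provider clients that are not installed. A small stand-alone illustration of the pattern (the helper name is illustrative, not part of EcoLogits):

```python
import importlib.util


def is_available(package: str) -> bool:
    # find_spec probes for a package without importing it; None means
    # the optional provider dependency is absent.
    return importlib.util.find_spec(package) is not None


print(is_available("json"))  # stdlib modules are always found
```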
35 changes: 22 additions & 13 deletions ecologits/model_repository.py
@@ -11,6 +11,7 @@ class Providers(Enum):
     openai = "openai"
     huggingface_hub = "huggingface_hub"
     cohere = "cohere"
+    google = "google"
 
 
 class Warnings(Enum):
@@ -43,22 +44,28 @@ def find_model(self, provider: str, model_name: str) -> Optional[Model]:
     @classmethod
     def from_csv(cls, filepath: Optional[str] = None) -> "ModelRepository":
         if filepath is None:
-            filepath = os.path.join(os.path.dirname(os.path.realpath(__file__)), "data", "models.csv")
+            filepath = os.path.join(
+                os.path.dirname(os.path.realpath(__file__)), "data", "models.csv"
+            )
         models = []
         with open(filepath) as fd:
             csv = DictReader(fd)
             for row in csv:
                 total_parameters = None
                 total_parameters_range = None
                 if ";" in row["total_parameters"]:
-                    total_parameters_range = [float(p) for p in row["total_parameters"].split(";")]
+                    total_parameters_range = [
+                        float(p) for p in row["total_parameters"].split(";")
+                    ]
                 elif row["total_parameters"] != "":
                     total_parameters = float(row["total_parameters"])
 
                 active_parameters = None
                 active_parameters_range = None
                 if ";" in row["active_parameters"]:
-                    active_parameters_range = [float(p) for p in row["active_parameters"].split(";")]
+                    active_parameters_range = [
+                        float(p) for p in row["active_parameters"].split(";")
+                    ]
                 elif row["active_parameters"] != "":
                     active_parameters = float(row["active_parameters"])
 
@@ -70,16 +77,18 @@ def from_csv(cls, filepath: Optional[str] = None) -> "ModelRepository":
                 if row["sources"] != "":
                     sources = row["sources"].split(";")
 
-                models.append(Model(
-                    provider=Providers(row["provider"]).name,
-                    name=row["name"],
-                    total_parameters=total_parameters,
-                    active_parameters=active_parameters,
-                    total_parameters_range=total_parameters_range,
-                    active_parameters_range=active_parameters_range,
-                    warnings=warnings,
-                    sources=sources
-                ))
+                models.append(
+                    Model(
+                        provider=Providers(row["provider"]).name,
+                        name=row["name"],
+                        total_parameters=total_parameters,
+                        active_parameters=active_parameters,
+                        total_parameters_range=total_parameters_range,
+                        active_parameters_range=active_parameters_range,
+                        warnings=warnings,
+                        sources=sources,
+                    )
+                )
         return cls(models)
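With the new rows in `models.csv`, `find_model("google", "gemini-1.5-flash")` should now resolve. A toy stand-in for that lookup (heavily simplified; the real `Model` also carries parameter counts, warnings, and sources):

```python
from typing import Optional, Tuple


class MiniRepository:
    # Simplified sketch of ModelRepository.find_model: index models by
    # (provider, name) and return None for unknown pairs.
    def __init__(self, models):
        self._index = {(provider, name): (provider, name) for provider, name in models}

    def find_model(self, provider: str, model_name: str) -> Optional[Tuple[str, str]]:
        return self._index.get((provider, model_name))


repo = MiniRepository([("google", "gemini-1.5-flash"), ("google", "gemini-1.5-pro")])
```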


