Hi Team - it would be great if you could take a look at the following issue:
Importing langchain_google_genai raises an AttributeError at startup when proto-plus ≥ 1.24.0 and protobuf ≥ 5.x are installed. The service cannot start.
The error originates in the auto-generated file google/ai/generativelanguage_v1beta/types/content.py, which calls proto.module(...) as a callable function. This function alias no longer exists in current versions of proto-plus - the equivalent API is proto.Module (the class, capital M). Because content.py is auto-generated and has not been regenerated against a current proto-plus, the stale call to proto.module() causes an AttributeError on every import.
This affects any project that:
- Uses langchain-google-genai (or google-generativeai directly)
- Requires protobuf >= 5.x (e.g. due to other dependencies such as opentelemetry-exporter-otlp-proto-grpc, confluent-kafka, etc.)

This combination forces the resolver to pick proto-plus ≥ 1.24.0, which no longer exposes proto.module as a callable.
The issue cannot be resolved through version pinning alone. Downgrading proto-plus to a version that retains proto.module conflicts with the protobuf >= 5.x requirement. Upgrading google-ai-generativelanguage to newer minor versions does not resolve it — the generated types/content.py in all currently available releases still contains the stale proto.module(...) call.
Full Traceback
Traceback (most recent call last):
  File "/venv/bin/uvicorn", line 6, in <module>
    sys.exit(main())
  ...
  File "/app/llm_service.py", line 80, in <module>
    from langchain_google_genai import ChatGoogleGenerativeAI
  File "/venv/lib/python3.11/site-packages/langchain_google_genai/__init__.py", line 58, in <module>
    from langchain_google_genai._enums import HarmBlockThreshold, HarmCategory
  File "/venv/lib/python3.11/site-packages/langchain_google_genai/_enums.py", line 1, in <module>
    import google.ai.generativelanguage_v1beta as genai
  File "/venv/lib/python3.11/site-packages/google/ai/generativelanguage_v1beta/__init__.py", line 21, in <module>
    from .services.cache_service import CacheServiceAsyncClient, CacheServiceClient
  File "/venv/lib/python3.11/site-packages/google/ai/generativelanguage_v1beta/services/cache_service/__init__.py", line 16, in <module>
    from .async_client import CacheServiceAsyncClient
  File "/venv/lib/python3.11/site-packages/google/ai/generativelanguage_v1beta/services/cache_service/async_client.py", line 51, in <module>
    from google.ai.generativelanguage_v1beta.services.cache_service import pagers
  File "/venv/lib/python3.11/site-packages/google/ai/generativelanguage_v1beta/services/cache_service/pagers.py", line 41, in <module>
    from google.ai.generativelanguage_v1beta.types import cache_service, cached_content
  File "/venv/lib/python3.11/site-packages/google/ai/generativelanguage_v1beta/types/__init__.py", line 16, in <module>
    from .cache_service import (
  File "/venv/lib/python3.11/site-packages/google/ai/generativelanguage_v1beta/types/cache_service.py", line 23, in <module>
    from google.ai.generativelanguage_v1beta.types import (
  File "/venv/lib/python3.11/site-packages/google/ai/generativelanguage_v1beta/types/cached_content.py", line 24, in <module>
    from google.ai.generativelanguage_v1beta.types import content
  File "/venv/lib/python3.11/site-packages/google/ai/generativelanguage_v1beta/types/content.py", line 23, in <module>
    __protobuf__ = proto.module(
                   ^^^^^^^^^^^^
AttributeError: module 'proto' has no attribute 'module'
Root Cause:
google/ai/generativelanguage_v1beta/types/content.py (and other generated types files) contain:

__protobuf__ = proto.module(
    package="google.ai.generativelanguage.v1beta",
    marshal=...,
    manifest={...},
)
proto.module was the original callable function exported from proto-plus. In current proto-plus, this has been superseded by proto.Module (the class). The auto-generated code has not been regenerated to reflect this API change, so the call fails with AttributeError.
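Until the generated files are regenerated, the mismatch can in principle be bridged with a small import-time shim. The sketch below is hypothetical and assumes, as described above, that proto.Module accepts the same arguments the old proto.module callable did; it must run before anything imports google.ai.generativelanguage_v1beta.

```python
# Hypothetical compatibility shim (e.g. in sitecustomize.py, or at the top
# of the application entry point). Assumes proto.Module takes the same
# arguments as the removed proto.module callable.
try:
    import proto  # proto-plus
except ImportError:
    proto = None  # proto-plus not installed; nothing to patch

if proto is not None and not hasattr(proto, "module") and hasattr(proto, "Module"):
    # Restore the removed alias so stale generated code keeps importing.
    proto.module = proto.Module
```

Because the patch is guarded by hasattr checks, it is a no-op on proto-plus versions that still (or again) export proto.module.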
Steps to Reproduce
- Create a fresh Python 3.11 virtual environment
- Install: pip install langchain-google-genai "google-generativeai>=0.8.0" "protobuf>=5.26.0" (the specifiers are quoted so the shell does not interpret >=)
- Run: python -c "from langchain_google_genai import ChatGoogleGenerativeAI"
- Observe: AttributeError: module 'proto' has no attribute 'module'
Expected Behaviour:
from langchain_google_genai import ChatGoogleGenerativeAI imports successfully when protobuf >= 5.x and proto-plus >= 1.24.0 are installed.
Actual Behaviour:
AttributeError is raised immediately on import, preventing any application that depends on langchain-google-genai from starting.
Suggested Fix:
Regenerate the auto-generated types files in google/ai/generativelanguage_v1beta/types/ against a current version of proto-plus. Specifically, proto.module(...) calls in the generated output should be updated to use proto.Module(...) (the class constructor), which is the current API.
All affected .py files in types/ that contain __protobuf__ = proto.module( need to be regenerated or patched.
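For an already-deployed environment, a one-off in-place rewrite of the stale calls is conceivable as a stopgap (it is not a substitute for regenerating the files upstream). The sketch below demonstrates the idea on a scratch directory; pointing TYPES_DIR at the real site-packages path would apply it for real. It assumes GNU sed (macOS/BSD sed needs -i '').

```shell
# Demonstrate the patch on a scratch copy of a stale generated file.
TYPES_DIR=$(mktemp -d)
printf '__protobuf__ = proto.module(\n' > "$TYPES_DIR/content.py"

# Rewrite every stale proto.module( call to proto.Module( in place.
grep -rl 'proto\.module(' "$TYPES_DIR" | xargs sed -i 's/proto\.module(/proto.Module(/g'

cat "$TYPES_DIR/content.py"  # prints: __protobuf__ = proto.Module(
```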
Workaround:
Until a fix is released, consumers can guard the import:
try:
    from langchain_google_genai import ChatGoogleGenerativeAI
except (ImportError, AttributeError):
    ChatGoogleGenerativeAI = None
This allows the application to start and degrade gracefully when the Gemini provider is not available, rather than crashing at boot.
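As a usage sketch, the guarded import can back a small factory that raises a clear runtime error on first use instead of crashing the whole service at boot. Note that make_gemini_client and the default model name here are illustrative, not part of langchain-google-genai:

```python
# Graceful-degradation pattern around the guarded import.
# make_gemini_client and the default model name are hypothetical.
try:
    from langchain_google_genai import ChatGoogleGenerativeAI
except (ImportError, AttributeError):
    ChatGoogleGenerativeAI = None  # Gemini provider unavailable


def make_gemini_client(model: str = "gemini-1.5-pro"):
    """Return a Gemini chat client, or raise a clear error if unavailable."""
    if ChatGoogleGenerativeAI is None:
        raise RuntimeError(
            "Gemini provider unavailable: langchain-google-genai failed to import"
        )
    return ChatGoogleGenerativeAI(model=model)
```

Other providers registered by the application keep working; only requests that explicitly select Gemini see the error.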