
Node module for the Google ecosystem with Gemini AI

We build an ecosystem around the Google Drive service so that Drive can be accessed through Gemini. With this module you can create, update, download, and read Drive files, including images and videos, which can only be sent as part of a prompt. There are also some built-in parameters: every prompt we send to the model includes parameter values that control how the model generates a response, and the model can generate different results for different parameter values. The most common model parameters are:

    Max output tokens
    Temperature
    topK
    topP
    stop_sequences

Max output tokens:

Specifies the maximum number of tokens that can be generated in the response. A token is approximately four characters, so 100 tokens correspond to roughly 80 words.

Temperature:

The temperature controls the degree of randomness in token selection. The temperature is used for sampling during response generation, which occurs when topP and topK are applied. Lower temperatures are good for prompts that require a more deterministic or less open-ended response, while higher temperatures can lead to more diverse or creative results. A temperature of 0 is deterministic, meaning that the highest-probability response is always selected.

topK:

The topK parameter changes how the model selects tokens for output. A topK of 1 means the selected token is the most probable among all the tokens in the model's vocabulary (also called greedy decoding), while a topK of 3 means the next token is selected from among the 3 most probable, using the temperature. For each token-selection step, the topK tokens with the highest probabilities are sampled. Tokens are then further filtered based on topP, with the final token selected using temperature sampling.

topP:

The topP parameter changes how the model selects tokens for output. Tokens are selected from the most to the least probable until the sum of their probabilities equals the topP value. For example, if tokens A, B, and C have probabilities of 0.3, 0.2, and 0.1 and the topP value is 0.5, then the model will select either A or B as the next token (using the temperature) and exclude C as a candidate. The default topP value is 0.95.

stop_sequences:

Set a stop sequence to tell the model to stop generating content. A stop sequence can be any sequence of characters. Try to avoid using a sequence of characters that may appear in the generated content.
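As a rough illustration of how these parameters might be set through this module's changeConfig method (row 6 of the method table further down), here is a minimal sketch. The package import and the NodeGem class name are assumptions for illustration, as is whether stopSequence accepts an array; only the parameter names come from the table.

```ts
import { NodeGem } from "nodegem"; // assumed export name, adjust to the real one

// Initialize the model wrapper with an API key and a Gemini model name.
const model = new NodeGem(process.env.API_KEY as string, "gemini-1.5-flash");

// Tune generation: cap the response length, reduce randomness,
// keep the default nucleus-sampling cutoff, and stop on "END".
model.changeConfig({
  maxOutputTokens: 256,  // ~1000 characters, roughly 200 words
  temperature: 0.2,      // close to deterministic
  topP: 0.95,            // default value mentioned above
  topK: 3,               // sample among the 3 most probable tokens
  stopSequence: ["END"], // assumed to accept an array of stop strings
});
```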

Safety categories that make Gemini safe:

Harassment:

Negative or harmful comments targeting identity and/or protected attributes.

Hate speech:

Content that is rude, disrespectful, or profane.

Sexually explicit:

Contains references to sexual acts or other lewd content.

Dangerous:

Promotes, facilitates, or encourages harmful acts.

Thresholds for Google AI Studio:

BLOCK_NONE

Always show regardless of the probability of unsafe content.

BLOCK_ONLY_HIGH

Block when there is a high probability of unsafe content.

BLOCK_MEDIUM_AND_ABOVE

Block when there is a medium or high probability of unsafe content.

HARM_BLOCK_THRESHOLD_UNSPECIFIED

Threshold is unspecified; block using the default threshold.

Probabilities:

    Negligible

Content has a negligible probability of being unsafe

    Low

Content has a low probability of being unsafe

    Medium

Content has a medium probability of being unsafe

    High

Content has a high probability of being unsafe
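The categories and thresholds above can be combined into the Map that changeSafetySettings (row 7 of the method table below) expects. The exact key and value strings this module accepts are not documented here, so the strings below, which mirror the Gemini API enum names, are an assumption.

```ts
import { NodeGem } from "nodegem"; // assumed export name

const model = new NodeGem(process.env.API_KEY as string, "gemini-1.5-flash");

// Map each harm category to a blocking threshold. Category and threshold
// strings mirror the Gemini API enums and are assumed, not confirmed.
const safetySettings = new Map<string, string>([
  ["HARM_CATEGORY_HARASSMENT", "BLOCK_MEDIUM_AND_ABOVE"],
  ["HARM_CATEGORY_HATE_SPEECH", "BLOCK_MEDIUM_AND_ABOVE"],
  ["HARM_CATEGORY_SEXUALLY_EXPLICIT", "BLOCK_ONLY_HIGH"],
  ["HARM_CATEGORY_DANGEROUS_CONTENT", "BLOCK_NONE"],
]);

model.changeSafetySettings(safetySettings);
```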

To experience this, we create a React web application for e-commerce.

We create a basic e-commerce web application containing a login page and a fully validated file-upload page. The website demonstrates how to use the module we are developing. As our example business model we chose trend-driven clothing shops, which have become very popular, because such a brand can rarely do exceptionally well in both clothing and branding at the same time to increase its sales. To improve branding and discoverability, the module generates keywords, a description, and related features that are useful for both the end user and the brand; the product image is not sent as a direct prompt but is fetched from Google Drive and then prompted through Gemini. This way the brand does not have to worry about marketing and branding, and customers get what they expect at the right time.

To provide these services we initialise API keys, use them to access Google Drive and Gemini AI, and combine both to get the results we expect. Images and related information are transferred as streams and stored in buffers, which makes it easier to organise the different collections of data.

For Google authentication we support two methods: OAuth 2.0 and service accounts. For a service account we create a private key that is used to sign JWTs (JSON Web Tokens) for client verification. For OAuth 2.0 we create a new project, configure the redirect URLs, and download the credentials; the connectAuthClient function then sets up session management for the OAuth client. connectAuthClient takes three parameters: token_path, credentials_path, and scopes. token_path is the location of the saved token file, credentials_path is the path where the downloaded credentials are stored, and scopes lists the OAuth 2.0 scopes you need to request to access Google APIs, depending on the level of access required. Sensitive scopes require review by Google and carry a sensitive indicator on the Google Cloud Console's OAuth consent screen configuration page; many scopes overlap, so it's best to use a scope that isn't sensitive. If your public application uses scopes that permit access to certain user data, it must complete a verification process, and if you see "unverified app" on the screen when testing your application, you must submit a verification request to remove it.
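Below is a minimal sketch of both authentication paths, based only on the connectServiceAccount and connectAuthClient signatures in the method table below. The import and class names, the file paths, the interpretation of clientKey as the service-account email, and the chosen Drive scopes are all assumptions.

```ts
import { NodeGem } from "nodegem"; // assumed export name

const model = new NodeGem(process.env.API_KEY as string, "gemini-1.5-flash");

// Option 1: service account. The private key is used to sign a JWT
// (JSON Web Token) that Google verifies.
await model.connectServiceAccount(
  "bot@my-project.iam.gserviceaccount.com", // clientKey (assumed: service-account email)
  "./service-account.json",                 // keyFile
  process.env.PRIVATE_KEY as string,        // privateKey
  ["https://www.googleapis.com/auth/drive"] // scopes
);

// Option 2: OAuth 2.0 client. token_path is where the saved token lives,
// credentials_path is where the downloaded client credentials live.
await model.connectAuthClient(
  "./token.json",
  "./credentials.json",
  ["https://www.googleapis.com/auth/drive"]
);
```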

| S.No | Method | Parameters | Return Type | Description |
|------|--------|------------|-------------|-------------|
| 1 | constructor | API_KEY: string, modelName: string | void | The constructor function which initializes the model |
| 2 | generateContent | prompt: string, stream: boolean | Promise | Returns the generated content from Gemini AI |
| 3 | switchModel | modelName: string | void | Switches from one model to another |
| 4 | fileToGenerativePart | path: string, mimeType: string | JSON | Converts an image into a buffer structure for prompting |
| 5 | useTextAndImage | imageParts: string[][], stream: boolean, prompt: string | Promise | Prompts with both an image and text |
| 6 | changeConfig | { maxOutputTokens, temperature, topP, topK, stopSequence }: ModelConfig | void | Changes the configuration of the model |
| 7 | changeSafetySettings | input: Map | void | Changes the safety settings of the module |
| 8 | loadSavedCredentialsIfExist | token_path: string | Promise | Loads the token file if it exists |
| 9 | connectServiceAccount | clientKey: string, keyFile: string, privateKey: string, scopes: string[] | Promise | Connects to Google service accounts |
| 10 | saveCredentials | client: any, credentials_path: string, token_path: string | Promise | Saves the token credentials |
| 11 | connectAuthClient | token_path: string, credentials_path: string, scopes: string[] | Promise | Connects to the Google OAuth 2.0 client ID |
| 12 | listFiles | pageSize: number | Promise | Lists the Drive files in the service account |
| 13 | uploadFile | fileName: string | Promise | Uploads a file to the service account |
| 14 | updateFile | fileName: string, fileId: string | Promise | Updates a file in the service account |
| 15 | downloadFile | fileName: string, fileId: string, listenerFunction: any | Promise | Downloads a file from the service account |
| 16 | driveAndPrompt | prompt: string, fileName: string, fileId: string | Promise | Directly prompts Gemini AI with an image file stored in the service account |
| 17 | returnImageBuffer | fileName: string, fileId: string | Promise | Returns an image buffer for the image |
| 18 | returnSnippet | userId: string, maxResult: number | Promise | Returns mail snippets from the user account |
| 19 | promptSnippet | userId: string, snippetID: string, prompt: string | Promise | Prompts Gemini with a snippet from the user account |
| 20 | sendMail | from: string, to: string, subject: string, text: string, html: string, name: string, prompt: string | string | Sends a mail along with a custom prompt |
| 21 | translateText | text: any, targetLang: string | Promise | Translates a given text into the target language |
| 22 | getBlogData | API_KEY: string, bloggerID: string | Promise | Gets the blog data from the Blogger service |
| 23 | generateBlogContent | prompt: string \| string[] | string | Generates blog content depending on the user request |
| 24 | getBlogContent | API_KEY: string, bloggerID: string, doPrompt: boolean, prompt: string | Promise | Gets the blog content and adds a custom prompt to it |
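To make the table concrete, here is a rough end-to-end sketch that combines connectAuthClient, listFiles, and driveAndPrompt (rows 11, 12, and 16). The return shapes are not documented, so the logging below is a guess; the import, class name, model name, paths, and file id are placeholders.

```ts
import { NodeGem } from "nodegem"; // assumed export name

const model = new NodeGem(process.env.API_KEY as string, "gemini-1.5-flash");

// Authenticate against Google Drive (see the OAuth 2.0 sketch above).
await model.connectAuthClient("./token.json", "./credentials.json", [
  "https://www.googleapis.com/auth/drive",
]);

// List the first 10 files in the connected Drive.
const files = await model.listFiles(10);
console.log(files);

// Prompt Gemini directly with an image that lives in Drive.
const answer = await model.driveAndPrompt(
  "Describe this product photo and suggest search keywords.",
  "tshirt.png",     // fileName
  "<drive-file-id>" // fileId, taken from the listFiles result
);
console.log(answer);
```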

Sample Projects:

A simple e-commerce website developed using this module: it processes the uploaded images and generates descriptions and keywords for the clothes.
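The image-to-keywords flow that the sample project describes could look roughly like the sketch below, using fileToGenerativePart and useTextAndImage (rows 4 and 5 of the method table). The argument order follows the table; whether a single part must be wrapped in an array, and the exact return value, are assumptions.

```ts
import { NodeGem } from "nodegem"; // assumed export name

const model = new NodeGem(process.env.API_KEY as string, "gemini-1.5-flash");

// Convert the uploaded product photo into the buffer/part structure
// the model expects.
const imagePart = model.fileToGenerativePart("./uploads/tshirt.png", "image/png");

// Ask Gemini for a product description plus search keywords,
// without streaming the response.
const result = await model.useTextAndImage(
  [imagePart], // imageParts (assumed wrapping)
  false,       // stream
  "Write a short product description and 10 search keywords for this item."
);
console.log(result);
```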

Urls:
