CR-215 workbench local storage history #4120
Merged: pd-redis merged 19 commits into feature/dynamic-dependencies from fe/feature/CR-215-workbench-local-storage-history on Dec 5, 2024
Commits (19):
6ed0171  add shortcuts in fetch action according to feature flag
5e292d0  initial implementation of localstorage workbench history
0d1564b  fix searching for commands in local history
5e7545b  scope local history to specific redis db id
41a7cdb  add basic implementation of indexedDb storage, used for WB history only.
c402003  pass store name directly on update
d7fa819  moved indexedDbStorage to separate file
5bf8df6  pause
2cba204  more indexedDbStorage.ts changes to store commands separately
34151e7  refactored wb storage to separate file workbenchStorage.ts
25be9c4  Merge branch 'feature/dynamic-dependencies-public' into fe/feature/CR…
8ed58e2  reverse order of keypaths
4df5680  add callbacks for different operations
98e125d  use config for indexed db name
d4bf832  add isNotStored flag to large results
a410209  change wb history collection name
6813bca  revert changes to QueryCardCliPlugin.tsx
b628e41  revert changes to QueryCard.tsx and workbenchStorage.ts related to to…
171f6ae  set max result size to 1mb
workbenchStorage.ts (new file)
@@ -0,0 +1,319 @@
import { flatten } from 'lodash'
import { CommandExecution } from 'uiSrc/slices/interfaces'
import { BrowserStorageItem } from 'uiSrc/constants'
import { CommandExecutionStatus } from 'uiSrc/slices/interfaces/cli'
import { getConfig } from 'uiSrc/config'
import { formatBytes } from 'uiSrc/utils'
import { WORKBENCH_HISTORY_MAX_LENGTH } from 'uiSrc/pages/workbench/constants'

const riConfig = getConfig()

export class WorkbenchStorage {
  private db?: IDBDatabase

  constructor(private readonly dbName: string, private readonly version = 1) {
  }

  private initDb(storeName: string) {
    return new Promise((resolve, reject) => {
      if (!window.indexedDB) {
        reject(new Error('indexedDB is not supported'))
        return
      }
      // Open the database
      const DBOpenRequest = window.indexedDB.open(this.dbName, this.version)

      DBOpenRequest.onerror = (event) => {
        event.preventDefault()
        reject(DBOpenRequest.error)
        console.error('indexedDB open error')
      }

      DBOpenRequest.onsuccess = () => {
        this.db = DBOpenRequest.result
        this.db.onversionchange = (e) => {
          // Triggered when the database is modified (e.g. adding an objectStore) or
          // deleted (even when initiated by other sessions in different tabs).
          // Closing the connection here prevents those operations from being blocked.
          // If the database is accessed again later by this instance, the connection
          // will be reopened or the database recreated as needed.
          (e.target as IDBDatabase)?.close()
        }
        resolve(this.db)
      }

      // Handles the case where a new version of the database needs to be created:
      // either no database has been created before, or a new version number was
      // passed to the window.indexedDB.open call above.
      DBOpenRequest.onupgradeneeded = (event) => {
        this.db = DBOpenRequest.result

        this.db.onerror = (event) => {
          event.preventDefault()
          reject(DBOpenRequest.error)
        }

        try {
          if (event.newVersion && event.newVersion > event.oldVersion
            && event.oldVersion > 0
            && this.db.objectStoreNames.contains(storeName)) {
            // the store exists from a previous version and needs to be recreated
            this.db.deleteObjectStore(storeName)
          }
          // Create an objectStore for this database, keyed by command id + database id
          const objectStore = this.db.createObjectStore(storeName, { keyPath: ['id', 'databaseId'] })
          objectStore.createIndex('dbId', 'databaseId', { unique: false })
          objectStore.createIndex('commandId', 'id', { unique: true })
        } catch (ex) {
          if (ex instanceof DOMException && ex?.name === 'ConstraintError') {
            console.warn(
              `The database "${this.dbName}" has been upgraded from version ${event.oldVersion} to version
              ${event.newVersion}, but the storage "${storeName}" already exists.`,
            )
          } else {
            throw ex
          }
        }
      }
    })
  }

  async getDb(storeName: string) {
    if (!this.db) {
      await this.initDb(storeName)
    }
    return this.db
  }

  getItem(storeName: string, commandId: string, onSuccess?: () => void, onError?: () => void) {
    return new Promise((resolve, reject) => {
      try {
        this.getDb(storeName).then((db) => {
          if (db === undefined) {
            reject(new Error('Failed to retrieve item from IndexedDB'))
            return
          }
          const objectStore = db.transaction(storeName, 'readonly')?.objectStore(storeName)
          const idbIndex = objectStore?.index('commandId')
          const indexReq = idbIndex?.get(commandId)
          indexReq.onsuccess = () => {
            const value = indexReq.result
            onSuccess?.()
            resolve(value)
          }
          indexReq.onerror = () => {
            onError?.()
            reject(indexReq.error)
          }
        })
      } catch (e) {
        onError?.()
        reject(e)
      }
    })
  }

  getItems(storeName: string, dbId: string, onSuccess?: () => void, onError?: () => void) {
    return new Promise((resolve, reject) => {
      try {
        this.getDb(storeName).then((db) => {
          if (db === undefined) {
            reject(new Error('Failed to retrieve item from IndexedDB'))
            return
          }
          const objectStore = db.transaction(storeName, 'readonly')?.objectStore(storeName)
          const idbIndex = objectStore?.index('dbId')
          const indexReq = idbIndex?.getAll(dbId)
          indexReq.onsuccess = () => {
            const values = indexReq.result
            onSuccess?.()
            if (values && values.length > 0) {
              resolve(values)
            } else {
              resolve([])
            }
          }
          indexReq.onerror = () => {
            onError?.()
            reject(indexReq.error)
          }
        })
      } catch (e) {
        onError?.()
        reject(e)
      }
    })
  }

  setItem(storeName: string, value: any, onSuccess?: () => void, onError?: () => void): Promise<void> {
    return new Promise((resolve, reject) => {
      try {
        this.getDb(storeName).then((db) => {
          if (db === undefined) {
            reject(new Error('Failed to set item in IndexedDB'))
            return
          }
          const transaction = db.transaction(storeName, 'readwrite')
          const req = transaction?.objectStore(storeName)?.put(value)
          transaction.oncomplete = () => {
            onSuccess?.()
            resolve()
          }
          transaction.onerror = () => {
            onError?.()
            reject(req?.error)
          }
        })
      } catch (e) {
        reject(e)
      }
    })
  }

  removeItem(
    storeName: string,
    dbId: string,
    commandId: string,
    onSuccess?: () => void,
    onError?: () => void
  ): Promise<string | void> {
    return new Promise((resolve, reject) => {
      try {
        this.getDb(storeName).then((db) => {
          if (db === undefined) {
            reject(new Error('Failed to remove item from IndexedDB'))
            return
          }
          const transaction = db.transaction(storeName, 'readwrite')
          const req = transaction.objectStore(storeName)?.delete([commandId, dbId])

          transaction.oncomplete = () => {
            onSuccess?.()
            resolve(commandId)
          }

          transaction.onerror = () => {
            onError?.()
            reject(req?.error)
          }
        })
      } catch (e) {
        onError?.()
        reject(e)
      }
    })
  }

  clear(storeName: string, dbId: string, onSuccess?: () => void, onError?: () => void): Promise<void> {
    return new Promise((resolve, reject) => {
      try {
        this.getDb(storeName).then((db) => {
          if (db === undefined) {
            reject(new Error('Failed to clear items in IndexedDB'))
            return
          }

          const objectStore = db.transaction(storeName, 'readwrite')?.objectStore(storeName)
          const idbIndex = objectStore?.index('dbId')
          const indexReq = idbIndex?.openCursor(dbId)
          indexReq.onsuccess = () => {
            const cursor = indexReq.result
            onSuccess?.()
            if (cursor) {
              cursor.delete()
              cursor.continue()
            } else {
              // either deleted all items or there were none
              resolve()
            }
          }
          indexReq.onerror = () => {
            onError?.()
            reject(indexReq.error)
          }
        })
      } catch (e) {
        onError?.()
        reject(e)
      }
    })
  }
}

export const wbHistoryStorage = new WorkbenchStorage(riConfig.app.indexedDbName, 2)

type CommandHistoryType = CommandExecution[]

export async function getLocalWbHistory(dbId: string) {
  try {
    const history = await wbHistoryStorage.getItems(BrowserStorageItem.wbCommandsHistory, dbId) as CommandHistoryType

    return history || []
  } catch (e) {
    console.error(e)
    return []
  }
}

export function saveLocalWbHistory(commandsHistory: CommandHistoryType) {
  try {
    const key = BrowserStorageItem.wbCommandsHistory
    return Promise.all(flatten(commandsHistory.map((chItem) => wbHistoryStorage.setItem(key, chItem))))
  } catch (e) {
    console.error(e)
    return null
  }
}

async function cleanupDatabaseHistory(dbId: string) {
  const commandsHistory: CommandHistoryType = await getLocalWbHistory(dbId)
  let size = 0
  // collect items up to WORKBENCH_HISTORY_MAX_LENGTH
  const update = commandsHistory.reduce((acc, commandsHistoryElement) => {
    if (size >= WORKBENCH_HISTORY_MAX_LENGTH) {
      return acc
    }
    size++
    acc.push(commandsHistoryElement)
    return acc
  }, [] as CommandHistoryType)
  // clear old items
  await clearCommands(dbId)
  // save the trimmed history
  await saveLocalWbHistory(update)
}

export async function addCommands(data: CommandExecution[]) {
  // Store command results in the local workbench history
  const storedData = data.map((item) => {
    // Do not store a command execution result that exceeds the size limit
    if (JSON.stringify(item.result).length > riConfig.workbench.maxResultSize) {
      item.result = [
        {
          status: CommandExecutionStatus.Success,
          response: `Results have been deleted since they exceed ${formatBytes(riConfig.workbench.maxResultSize)}.

Re-run the command to see new results.`,
        },
      ]
    }
    return item
  })
  await saveLocalWbHistory(storedData)
  const [{ databaseId }] = storedData
  return cleanupDatabaseHistory(databaseId)
}

export async function removeCommand(dbId: string, commandId: string) {
  // Delete a command from the local workbench history
  await wbHistoryStorage.removeItem(BrowserStorageItem.wbCommandsHistory, dbId, commandId)
}

export async function clearCommands(dbId: string) {
  await wbHistoryStorage.clear(BrowserStorageItem.wbCommandsHistory, dbId)
}

export async function findCommand(commandId: string) {
  // Fetch a command from the local workbench history
  return wbHistoryStorage.getItem(BrowserStorageItem.wbCommandsHistory, commandId)
}