Conversation
Updates to Preview Branch (introduce-spark-accounts)
Tasks are run on every commit, but only new migration files are pushed.
❌ Branch Error • Wed, 05 Nov 2025 22:55:08 UTC. View logs for this Workflow Run.
4c7939c to e89e243 • Compare
Putting it on the account would allow us to have different accounts for different networks. How likely is that?
supabase/migrations/20251125205840_add-spark-and-allow-one-account.sql (resolved, outdated)
```ts
unit: getDefaultUnit(commonData.currency),
}),
network: details.network,
isOnline: true,
```
Should we mark it offline if the call to get the balance fails?
Right now the wallet will crash if the Spark wallet fails to initialize. Instead of setting it offline when getting the balance fails, would it be better to make the wallet not crash when initialization fails, and set it offline instead?
Maybe we should do both.
Btw, sometimes if the Spark infrastructure is having problems while we are in the middle of a receive, the SDK will throw and crash the app. It seems like the SDK's internal logic throws when it tries to claim the transfer, but I don't remember exactly.
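A minimal sketch of the "don't crash on init failure, mark offline instead" idea. `initSparkWallet`, `loadSparkAccount`, and the account shape are all made up for illustration, not the real SDK or app API; the stand-in init deliberately always fails so the fallback path is visible:

```typescript
type SparkAccount = { isOnline: boolean; balance: number | null };

async function initSparkWallet(): Promise<{ getBalance: () => Promise<number> }> {
  // Stand-in for the real SDK initialization; it always fails here
  // to demonstrate the fallback path.
  throw new Error('Spark infrastructure unavailable');
}

async function loadSparkAccount(): Promise<SparkAccount> {
  try {
    const wallet = await initSparkWallet();
    return { isOnline: true, balance: await wallet.getBalance() };
  } catch {
    // Init (or the balance fetch) failed: mark the account offline
    // instead of letting the error crash the app.
    return { isOnline: false, balance: null };
  }
}
```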
> Right now the wallet will crash if the Spark wallet fails to initialize. Instead of setting it offline when getting the balance fails, would it be better to make the wallet not crash when initialization fails, and set it offline instead?
Yeah, I would say it's much nicer to show it as offline (Spark being offline shouldn't prevent me from spending my gift card, for example). The problem is that we won't know the balance in that case. We could treat it as 0, but users might panic, thinking they lost their money. An alternative is to store the last known balance in local storage (or even our db under account details) and use that, and if it's not present, show something other than a number.
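A sketch of that "last known balance" fallback. The storage key and helper names are invented, and a `Map` stands in for `localStorage` so it runs anywhere; the point is that "never stored" is distinguishable from 0, so the UI can show a placeholder instead of a scary zero:

```typescript
const BALANCE_KEY = 'spark:lastKnownBalance';

// Persist the most recent successfully fetched balance.
function rememberBalance(store: Map<string, string>, sats: number): void {
  store.set(BALANCE_KEY, String(sats));
}

// Return the cached balance, or null when nothing was ever stored,
// so callers can render "—" rather than a misleading 0.
function lastKnownBalance(store: Map<string, string>): number | null {
  const raw = store.get(BALANCE_KEY);
  return raw === undefined ? null : Number(raw);
}
```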
> Btw, sometimes if the Spark infrastructure is having problems while we are in the middle of a receive, the SDK will throw and crash the app. It seems like the SDK's internal logic throws when it tries to claim the transfer, but I don't remember exactly.

What exactly throws? Can we catch it, and where?
I don't remember, I will have to check next time it happens
I just looked at this lockfile and saw how much crap was added. Bundlephobia gives an install error for this package, so I asked Claude to do some digging.
TLDR: "Spark SDK adds roughly 5.4 MB raw / 3.0 MB gzipped to your client bundle - nearly a 4x increase"
Here's the full convo with some tables comparing with and without Spark. The last question I asked output a breakdown of the different packages.
I'm just bringing this up because it's very large, but I'm not sure there's anything we can really do. Claude says it pulls in 300-500 kB of Node dependencies that are irrelevant for the browser. I think the best we can do is complain to the Spark team.
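One thing worth noting alongside the size numbers: bundlers split a dynamic `import()` into a separate chunk that is only fetched on first use, which can keep a heavy SDK off the critical path even if its total size doesn't shrink. This is only a sketch of the mechanism; `node:path` stands in for the heavy SDK module so the snippet actually runs:

```typescript
// Lazily load a heavy module so it lands in its own chunk.
// 'node:path' is a stand-in here for an actual large SDK import.
async function loadSparkSdk(): Promise<unknown> {
  const mod = await import('node:path');
  return mod;
}
```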
> TLDR: "Spark SDK adds roughly 5.4 MB raw / 3.0 MB gzipped to your client bundle - nearly a 4x increase"

That's a lot. Did you try doing a prod build before and after adding it to confirm the number? We should do that before contacting Spark.
Yea, this analysis was from running `bun run build` before and after adding it.
Let's also analyze what this means for app load on slower (3G and 4G) networks. If the difference in load time isn't huge, I guess we can just ignore it for now.
I did this by emptying the cache and hard reloading, then looking at the "Finish" time in the Network tab in Brave.
| Network Speed | Without Spark (ms) | With Spark (ms) | Spark Bundle Load Time (ms) |
|---|---|---|---|
| 3G | 57,000 | 100,000 | 65,000 |
| Slow 4G | 20,000 | 30,000 | 18,000 |
| Fast 4G | 6,000 | 7,690 | 3,600 |
| No Throttling | 1,700 | 2,500 | 37 |
What is Spark Bundle Load Time?
You can do this better with Chrome's Lighthouse performance analysis (Chrome dev tools -> Lighthouse tab). There you can see how long it actually takes to load the page, how long until the page becomes interactive, etc. These are better metrics than just checking the network time, because there could be slow paths in the executed code as well.
> What is Spark Bundle Load Time?

The time I observed in the Network tab for the Spark SDK being downloaded. I see two things on load: one is a Spark index.js file, and the other is something like http://localhost:3000/assets/spark-{some-hash}.js
> You can do this better with Chrome's lighthouse performance analysis

Lighthouse says it takes too long without Spark on fast 4G and won't give me the full results, so here's with the Lighthouse default "Simulated throttling":
[Lighthouse results screenshot: Without Spark]
[Lighthouse results screenshot: With Spark]
Seems unlikely to me unless Spark started doing stuff on other chains (i.e. Solana mainnet), but hopefully we wouldn't need to use that even if they did. Otherwise the only reason I see for not using mainnet is testing/dev.
Balance doesn't get updated for me. I sent 2 sats from Strike and it worked. I got the toast success message, but the balance was never updated. Had to reload the app.
Let's just keep it as is.
e89e243 to c2e84c9 • Compare
I rebased it on top of master to resolve conflicts.
If we just keep it as is, does that mean we need to create the account before initializing the Spark wallet? If we always use mainnet, then we can initialize the Spark wallet along with the user's keys, because all that's needed is the seed from open secret.
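To illustrate why a fixed network removes the ordering problem: if the account row isn't needed to pick a network, an identity key can be derived from the seed alone at startup. `deriveIdentityKey` below is purely a placeholder (a hash, not the SDK's real derivation), just showing that the inputs reduce to mnemonic plus account number:

```typescript
import { createHash } from 'node:crypto';

// Placeholder derivation: the real Spark SDK derives the identity key from
// the seed; this just demonstrates that (mnemonic, accountNumber) suffices
// as input once the network is fixed.
function deriveIdentityKey(mnemonic: string, accountNumber = 0): string {
  return createHash('sha256')
    .update(`${mnemonic}/${accountNumber}`)
    .digest('hex');
}
```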
e947255 to b797a36 • Compare
```ts
export async function getSparkIdentityPublicKeyFromMnemonic(
  mnemonic: string,
  network: NetworkType,
  accountNumber?: number,
```
Would it be better to make this function not take the network, and instead require the accountNumber to be provided? The reason I did it this way is that the Spark SDK is not very transparent about which account numbers are used, but it might be better to create a new getDefaultAccountNumberForNetwork helper that is called first, then pass that account number to this function.
So I'd change the signature to:
```ts
async function getSparkIdentityPublicKeyFromMnemonic(
  mnemonic: string,
  accountNumber: number,
): Promise<string>
```

```ts
const cardContent =
  accountType === 'spark'
    ? 'Spark is offline. Your balance will be shown when you are online again.'
    : 'Account is offline. Your balance will be shown when you are online again.';
```
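The getDefaultAccountNumberForNetwork helper mentioned above might look something like this. The network-to-account mapping here is purely a guess for illustration; as the comment notes, the Spark SDK's actual account-number conventions are opaque and would need to be confirmed before relying on any mapping:

```typescript
type NetworkType = 'MAINNET' | 'REGTEST';

// Hypothetical helper: centralize the SDK's account-number convention so
// callers pass an explicit accountNumber instead of a network.
// The values below are assumptions, not confirmed SDK behavior.
function getDefaultAccountNumberForNetwork(network: NetworkType): number {
  return network === 'MAINNET' ? 1 : 0;
}
```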
Don't we show the cashu balance even when offline?
Yea. I just liked this so that the parent component doesn't need to check the account type. Do you have a suggestion for doing it differently? We could call this SparkBalanceOfflineHoverCard and then check the account type wherever it's used. Or maybe we don't even need to check, because only the Spark balance will be null, but that seems too implicit.
This is a work-in-progress Spark implementation based on this GDoc.
I've broken each piece up into its own commit.
What I've Done
What to do
transfer:claimed event for this, but that would only work for receives

Questions