The email is required for all the auth methods because it is used to link your account across them. The email will not be exposed in the API (or anywhere else, really).
Pages
Home
Hero
Title: Nest
Sub: The immutable module registry for Deno
Description
Features
Get started
Standard modules
User
Avatar
Username
Name
Tabs
Modules
Settings
[ pending ]
X
Search
Featured modules
All modules (default sort: updated)
Module
Name
Tabs
Code
Version selector
Link for the latest version
Nest URL breadcrumbs
Files
Link to raw (arweave) version
Link to type docs
Highlighted code
Insights
Full Name
Description
Downloads (chart for a week)
Dependencies (Maybe "dependents" too?)
License
Quality
Settings
[ pending ]
Blog
Minimal MDX blog
For detailed release notes and announcements
Post
Title
Date
Read time
Content
Edit link
Orgs
Hero with logo, name and description
Link to their homepage
Link to their GitHub/GitLab
List of all modules
Routes
nest.land # home page
nest.land/u # redirect to current user's profile
nest.land/u/:user # user profile & list of modules
nest.land/u/:user/settings # user settings
nest.land/x # modules
nest.land/x/:user # redirect to /u/:user
nest.land/x/:user/:module # redirect to latest version
nest.land/x/:user/:module/:filepath # redirect to latest version
nest.land/x/:user/:module@:version # details for specific version
nest.land/x/:user/:module@:version/:filepath # filepath for specific version
nest.land/- # redirect to /x
nest.land/-/:module # vanity URL for details for latest version (redirect)
nest.land/-/:module/:filepath # vanity URL for filepath for latest version (redirect)
nest.land/-/:module@:version # vanity URL for details for specific version
nest.land/-/:module@:version/:filepath # vanity URL for filepath for specific version
nest.land/docs # homepage for docs
nest.land/docs/nest # website docs
nest.land/docs/eggs # eggs docs
nest.land/docs/cli # nest CLI docs
nest.land/docs/api # API docs
nest.land/docs/terms # usage policies, etc.
nest.land/blog # latest posts and search
nest.land/blog/:slug # blog post
orgs.land # info about the service
<org>.orgs.land # organisation profile and modules list
<org>.orgs.land/:module # details for latest version
<org>.orgs.land/:module/:filepath # filepath for latest version
<org>.orgs.land/:module@:version # details for specific version
<org>.orgs.land/:module@:version/:filepath # filepath for specific version
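The `:module@:version` segment appears in several of these routes. A minimal sketch of how it could be parsed (the regex and the set of allowed name characters are illustrative, not the actual implementation):

```typescript
// Parse a ":module@:version" path segment, e.g. "eggs@1.2.3" or "eggs".
// Allowed name characters here are an assumption for illustration.
const SEGMENT = /^(?<module>[a-z0-9_]+)(?:@(?<version>[^/]+))?$/;

function parseModuleSegment(
  segment: string,
): { module: string; version?: string } | null {
  const match = SEGMENT.exec(segment);
  if (!match?.groups) return null;
  return { module: match.groups.module, version: match.groups.version };
}
```

A router would parse the segment, then either redirect (no version given) or render the pinned version.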
Notes
Why Next.js and not Vue/Nuxt or Svelte/Sapper?
Incremental Static Regeneration (ISR)
We use Vercel for most of our user-facing infrastructure, and Vercel has native support for Next.js.
We also depend on Vercel's sponsorship, and per the sponsorship conditions we cannot use SSR (which rules out Nuxt and Sapper), while client-side rendering greatly hurts performance (and, obviously, SEO).
Next.js also has an "API Routes" feature that can be used to create serverless functions alongside the website.
PS: Why Next.js is Great for egghead.io - Joel Hooks
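For illustration, a Next.js API route is just a default-exported handler function. The sketch below mimics its shape with local types so it stands alone; the endpoint path and response payload are hypothetical, not nest.land's actual API:

```typescript
// Hypothetical pages/api/modules/[name].ts-style handler. In a real project
// Req/Res would be NextApiRequest/NextApiResponse from "next"; local types are
// used here so the sketch is self-contained.
type Req = { query: Record<string, string | string[]> };
type Res = { status(code: number): { json(body: unknown): void } };

export default function handler(req: Req, res: Res) {
  const name = req.query.name;
  // A real handler would query the registry backend (e.g. Supabase) here.
  res.status(200).json({ module: name, latest: "0.1.0" });
}
```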
Because we are already using Supabase for our backend and it handles everything.
Why /x/:user/:module and not /x/:module?
People have to be able to publish a variant of an existing module with the same name, but under their own account. This is similar to how GitHub works: you can have "forks" of an existing project.
There are only so many sensibly short alphanumeric combinations that make sense as a name.
It also helps against "name squatting" and "typo squatting", which have been an issue with npm in the recent past.
This address will be reserved for good-quality, trustworthy modules, so whenever you see a module at that address, you know it's good. To get a vanity URL, a module will have to meet certain criteria that ensure code quality and authenticity. The vanity address will be granted after a manual review.
Criteria (under discussion):
Properly formatted
Properly linted
Properly documented
Includes a license file
Includes a lock file
90%+ test coverage
Authentication
The primary login would be an email/password pair (paired with hCaptcha).
Secondary options would be GitHub, GitLab, or a "magic link" to log in.
Emails are used to identify you if you use multiple login methods.
But what about GDPR?
GDPR doesn't prohibit collecting user info, provided there is a lawful basis for it (such as consent). What it restricts is sharing that info with third parties without the user's explicit consent.
Module Analytics
This will be moved to its own RFC once we settle on a solution.
Module import analytics may be the hardest part to implement correctly. We absolutely don't want someone to fake the numbers by just running `watch curl <module URL>`.
Checking the user agent for Deno won't work either, because you can just as easily run `watch deno cache -r <module URL>`.
So we need the analytics to be bound to the user's IP address, with a cooldown before that IP can be counted as a real user again. I'd suggest one or two days, since you don't need to refresh your cache that often.
But with that approach there is the problem of people abusing CI/CD services, since every instance gets a separate IP. You could write a cron job to `deno cache` the URL every minute.
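A minimal sketch of the cooldown idea, using an in-memory map (a real deployment would need a shared store such as Redis or Postgres; the key scheme and window length are assumptions):

```typescript
// Count a download only if this IP hasn't been counted for this module
// within the cooldown window.
const COOLDOWN_MS = 24 * 60 * 60 * 1000; // one day, as suggested above
const lastCounted = new Map<string, number>(); // key: `${ip}:${moduleUrl}`

function shouldCountDownload(
  ip: string,
  moduleUrl: string,
  now: number = Date.now(),
): boolean {
  const key = `${ip}:${moduleUrl}`;
  const last = lastCounted.get(key);
  if (last !== undefined && now - last < COOLDOWN_MS) return false;
  lastCounted.set(key, now);
  return true;
}
```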
An obvious question arises here:
How do npm and other registries do it? They've got legit download metrics, right?
Nope. npm themselves claim that their download counts are not accurate and that accurate analytics is hard.
tl;dr: they count every HTTP 200 request as a download. The same goes for GitHub, PyPI, and almost every other registry, because they don't need the numbers to be accurate.
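The npm-style approach amounts to something this simple, which is exactly why it's so easy to inflate (illustrative sketch, not any registry's actual code):

```typescript
// Naive counting: every successful response is a download, no dedup at all.
const downloads = new Map<string, number>();

function recordResponse(moduleUrl: string, status: number): void {
  if (status === 200) {
    downloads.set(moduleUrl, (downloads.get(moduleUrl) ?? 0) + 1);
  }
}
```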
Now someone would ask,
How does YouTube count views? I tried some bots; they filter them out just fine.
Agreed, and so does Google Analytics (or any analytics service, for that matter). But the two products and their distribution methods are fundamentally different.
They have the advantage of being on a website that can run JavaScript in the browser to figure out whether the client is a bot; an HTTP server has no such option.
This is the whole request that is sent to the URL from Deno when caching or running a remote file:
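A representative request, reconstructed for illustration (the exact headers vary by Deno version; this is not a verbatim capture):

```http
GET /x/user/module@1.0.0/mod.ts HTTP/1.1
host: nest.land
user-agent: Deno/1.5.2
accept: */*
accept-encoding: gzip, br
```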
There can be no captcha, no cookie, no cursor tracking, nothing to differentiate a bot from a user. At this point I think we should look for another metric for deciding the payout.
Misc
Module authors can apply to have a module manually verified. Only verified modules will be featured at the top of the "modules" page. "Verification" here means we check that the module isn't malicious or a "not useful enough" module (looking at you, isOdd()).
Lazy load the lists only on user input, i.e. when the client presses "Load more" (ref. design).
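The "Load more" behaviour boils down to paging through the module list; a minimal offset-based sketch (the real API shape is undecided, so the page type and helper are hypothetical):

```typescript
// Return one page of items plus the offset for the next "Load more" click,
// or undefined when the list is exhausted.
type Page<T> = { items: T[]; nextOffset?: number };

function getPage<T>(all: T[], offset: number, size: number): Page<T> {
  const items = all.slice(offset, offset + size);
  const next = offset + size;
  return { items, nextOffset: next < all.length ? next : undefined };
}
```

The client keeps `nextOffset` in state and hides the "Load more" button once it comes back undefined.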
Website
Design: figma.com/file/BfYs1TMPaEx8Mr9haEPE6H (incomplete)
Preview: figma.com/proto/BfYs1TMPaEx8Mr9haEPE6H