feat: expand best practices #924

Open · wants to merge 5 commits into base: `gh-pages`
116 changes: 115 additions & 1 deletion `_includes/parse-server/best-practice.md`

@@ -10,6 +10,57 @@ Protect all Parse Server endpoints using a Firewall to mitigate the risk of mali

- Use rate-limiting rules for public endpoints, for example limit the number of requests per IP address or per user (see the sketch after this list).
- Use very restrictive rules for private endpoints; for example limit access to Parse Dashboard to your personal network.
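
As an example of the rate-limiting rule above, recent Parse Server versions ship a built-in `rateLimit` option. The following is a minimal sketch, assuming that option is available in your version; the option names and placeholder configuration values should be checked against your version's documentation:

```js
import { ParseServer } from 'parse-server';

const server = new ParseServer({
  databaseURI: 'mongodb://localhost:27017/dev', // placeholder configuration
  appId: 'myAppId',
  masterKey: 'myMasterKey',
  rateLimit: [
    {
      // Limit the public login endpoint to 5 requests per IP per minute.
      requestPath: '/login',
      requestTimeWindow: 60 * 1000,
      requestCount: 5,
    },
  ],
});
```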

### Security-first mindset

When developing for Parse Server, assume that any Cloud Function or Cloud Code trigger will be abused by a malicious client. It is best to assume that client code cannot be trusted: even if you wrote the client code yourself, it is trivial for an attacker to replicate a request to your Parse Server and modify its arguments.

The following Cloud Code is **not recommended** and is an example of poor security practice:

```js
Parse.Cloud.define('updateEmail', async (req) => {
  // Looks up *any* user by a caller-supplied id with the master key,
  // bypassing ACLs, and writes a caller-supplied email without any checks.
  const user = await new Parse.Query(Parse.User).get(req.params.id, { useMasterKey: true });
  if (!user) {
    throw 'This user does not exist';
  }
  user.set('email', req.params.email);
  return user.save(null, { useMasterKey: true });
});
```

With this code, an attacker can simply guess object ids until they hit a valid user, receive that user's full object, and replace its email address, potentially exposing private user data. The attacker could then request a password reset to the new email address and take over the account completely.

```js
// Brute-force random 10-character ids until one matches a real user.
while (true) {
  let id = '';
  const chars = '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz'.split('');
  for (let i = 0; i < 10; i++) {
    id += chars[Math.floor(Math.random() * chars.length)];
  }
  try {
    // Pass an attacker-controlled email address along with the guessed id.
    const user = await Parse.Cloud.run('updateEmail', { id, email: 'attacker@example.com' });
    console.log(`Here is a full user object: `, user.toJSON());
    break;
  } catch (e) {
    console.log(`Not a valid id: ${id}`);
  }
}
```

Regardless of how well your Parse Server is secured with ACLs and CLPs, a few lines of insecure Cloud Code can undo that protection.

A secure solution is to ensure that users can only update their own email address, for example:

```js
Parse.Cloud.define('updateEmail', async (req) => {
req.user.set('email', req.params.email);
await req.user.save(null, { useMasterKey: true });
Copy link
Member

@mtrezza mtrezza Jul 4, 2023

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Wouldn't it be best practice to avoid any operation using useMasterKey: true but instead give only the user write permission in the object ACL? This would mitigate the risk that through a developer mistake in code someone could execute operations because they are using the master key; instead any permission would be bound to the user's session token. Using useMasterKey: true should only be necessary if the object is locked even for the user with read/write limited to master key.

return `Email updated`;
}, {
requireUser: true,
fields: ['email']
});
```
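
As a sketch of the review suggestion above (binding the operation to the caller's session token instead of the master key), the same function could look like this; this variant is not part of the original change:

```js
Parse.Cloud.define('updateEmail', async (req) => {
  req.user.set('email', req.params.email);
  // Save with the caller's own session token, so the write is governed by
  // the user's permissions rather than the unrestricted master key.
  await req.user.save(null, { sessionToken: req.user.getSessionToken() });
  return 'Email updated';
}, {
  requireUser: true,
  fields: ['email']
});
```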

## Optimization

The following is a list of design considerations to optimize data traffic and performance.

### Queries

- Use `select` and `exclude` to transfer only the fields that you need instead of the whole object (see the sketch at the end of this section).
- Run queries in parallel where possible. For example, consider the following objects:

```js
const Activity = Parse.Object.extend('Activity');
const History = Parse.Object.extend('History');

const user = Parse.User.current();
const activity = new Activity();
activity.set('user', user);
const history = new History();
history.set('user', user);
```

If we want to fetch a user's related objects, we could run the queries one after another:

```js
const activities = await new Parse.Query(Activity).equalTo('user', user).find();
const histories = await new Parse.Query(History).equalTo('user', user).find();
```

However, this is inefficient because the queries run one by one. We can run them in parallel instead:

```js
const [activities, histories] = await Promise.all([
  new Parse.Query(Activity).equalTo('user', user).find(),
  new Parse.Query(History).equalTo('user', user).find()
]);
```
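
To illustrate the `select` / `exclude` point at the top of this list, here is a minimal sketch; the class and field names are only illustrative:

```js
// Transfer only the fields we actually need from each Activity.
const activities = await new Parse.Query('Activity')
  .equalTo('user', user)
  .select('type', 'createdAt')
  .find();

// Or transfer everything except a large field we don't need.
const histories = await new Parse.Query('History')
  .equalTo('user', user)
  .exclude('rawPayload')
  .find();
```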

### Clustering

By default, Node.js runs JavaScript code on a single thread, meaning that if you are running Parse Server on a multi-core instance, you won't be using its full potential.

Clustering Parse Server is as simple as:

```js
import cluster from "cluster";
import os from "os";
import { createServer } from "http";
import express from "express";
import { ParseServer } from "parse-server";

// `config` and `REDIS_URL` are assumed to be defined elsewhere.
if (cluster.isPrimary) {
  const count = os.cpus().length;
  for (let i = 0; i < count; i++) {
    cluster.fork();
  }
  // Respawn a worker whenever one dies.
  cluster.on("exit", worker => {
    console.log(`worker ${worker.id} died. spawning a new process...`);
    cluster.fork();
  });
  console.log(`Parse Server started on ${count} workers`);
} else {
  console.log(`Worker #${cluster.worker.id} created.`);
  const app = express();
  const httpServer = createServer(app);
  const api = new ParseServer(config);
  await api.start();
  app.use("/parse", api.app);
  await new Promise(resolve => httpServer.listen(1337, resolve));
  await ParseServer.createLiveQueryServer(httpServer, {
    redisURL: REDIS_URL,
  });
}
```

Parse Server uses an internal cache to store user objects and roles for a short time between requests. If you are clustering, specify a global `cacheAdapter` in your Parse Server configuration to ensure the cache is kept in sync across workers.
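
For example, here is a minimal sketch using the `RedisCacheAdapter` that ships with Parse Server; the exact import and options may differ between versions, so check your version's documentation. `config` and `REDIS_URL` are the same values used in the clustering example above:

```js
import { ParseServer, RedisCacheAdapter } from 'parse-server';

// A shared Redis instance keeps cached users and roles consistent
// across all workers in the cluster.
const cacheAdapter = new RedisCacheAdapter({ url: REDIS_URL });

const api = new ParseServer({
  ...config,
  cacheAdapter,
});
```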