refactor: rename meassurePerformance to measureRenders (#433)
mdjastrzebski committed Feb 22, 2024
1 parent f0b7e66 commit 84abb74
Showing 13 changed files with 99 additions and 73 deletions.
7 changes: 7 additions & 0 deletions .changeset/chatty-paws-turn.md
@@ -0,0 +1,7 @@
+---
+'@callstack/reassure-measure': minor
+'reassure': minor
+'test-app-native': minor
+---
+
+- Rename `measurePerformance` to `measureRenders`.
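In practice the rename is a drop-in change; a minimal before/after sketch (`ComponentUnderTest` is a placeholder, as in the docs below):

```ts
// ComponentUnderTest.perf-test.tsx — hypothetical migration example
// Before: import { measurePerformance } from 'reassure';
import { measureRenders } from 'reassure';
import { ComponentUnderTest } from './ComponentUnderTest';

test('Simple test', async () => {
  // Same call shape as measurePerformance; only the name changed.
  await measureRenders(<ComponentUnderTest />);
});
```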
26 changes: 13 additions & 13 deletions README.md
@@ -94,11 +94,11 @@ Now that the library is installed, you can write your first test scenario in a f

```ts
// ComponentUnderTest.perf-test.tsx
-import { measurePerformance } from 'reassure';
+import { measureRenders } from 'reassure';
import { ComponentUnderTest } from './ComponentUnderTest';

test('Simple test', async () => {
-  await measurePerformance(<ComponentUnderTest />);
+  await measureRenders(<ComponentUnderTest />);
});
```

@@ -111,7 +111,7 @@ This test will measure render times of `ComponentUnderTest` during mounting and
If your component contains any async logic or you want to test some interaction, you should pass the `scenario` option:

```ts
-import { measurePerformance } from 'reassure';
+import { measureRenders } from 'reassure';
import { screen, fireEvent } from '@testing-library/react-native';
import { ComponentUnderTest } from './ComponentUnderTest';

@@ -121,7 +121,7 @@ test('Test with scenario', async () => {
await screen.findByText('Done');
};

-  await measurePerformance(<ComponentUnderTest />, { scenario });
+  await measureRenders(<ComponentUnderTest />, { scenario });
});
```

@@ -130,7 +130,7 @@ The body of the `scenario` function is using familiar React Native Testing Libra
If you use a version of React Native Testing Library lower than v10.1.0, where the [`screen` helper](https://callstack.github.io/react-native-testing-library/docs/api/#screen) is not available, the `scenario` function provides it as its first argument:

```ts
-import { measurePerformance } from 'reassure';
+import { measureRenders } from 'reassure';
import { fireEvent } from '@testing-library/react-native';

test('Test with scenario', async () => {
@@ -139,7 +139,7 @@ test('Test with scenario', async () => {
await screen.findByText('Done');
};

-  await measurePerformance(<ComponentUnderTest />, { scenario });
+  await measureRenders(<ComponentUnderTest />, { scenario });
});
```
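The hunk above elides the scenario definition; per the preceding paragraph it presumably receives `screen` as its first argument, roughly:

```ts
// Hypothetical reconstruction of the elided lines (RNTL < 10.1.0):
// `screen` arrives as the scenario's first argument instead of being imported.
const scenario = async (screen: any) => {
  // ...interactions via fireEvent...
  await screen.findByText('Done');
};
```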

@@ -352,23 +352,23 @@ Looking at the example, you can notice that test scenarios can be assigned to ce

### Measurements

-#### `measurePerformance` function
+#### `measureRenders` function

Custom wrapper for the RNTL `render` function, responsible for rendering the passed screen inside a `React.Profiler` component, measuring its performance, and writing results to the output file. You can use the optional `options` object to customize aspects of the testing.

```ts
-async function measurePerformance(
+async function measureRenders(
ui: React.ReactElement,
-  options?: MeasureOptions,
+  options?: MeasureRendersOptions,
): Promise<MeasureResults> {
```

-#### `MeasureOptions` type
+#### `MeasureRendersOptions` type

```ts
-interface MeasureOptions {
+interface MeasureRendersOptions {
runs?: number;
warmupRuns?: number;
wrapper?: React.ComponentType<{ children: ReactElement }>;
@@ -451,10 +451,10 @@ function configure(customConfig: Partial<Config>): void;

The `configure` function can override the default config parameters.

-#### `resetToDefault` function
+#### `resetToDefaults` function

```ts
-resetToDefault(): void
+resetToDefaults(): void
```

Reset the current config to the original `defaultConfig` object.
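To show the two config helpers together, a small usage sketch (assuming `runs` is a valid `Config` field, as suggested by `config.runs` later in this diff):

```ts
// config-usage.perf-test.tsx — illustrative sketch, not part of this commit
import { configure, resetToDefaults, measureRenders } from 'reassure';
import { ComponentUnderTest } from './ComponentUnderTest';

// Override a default for the tests in this file.
configure({ runs: 20 });

test('measured with a custom run count', async () => {
  await measureRenders(<ComponentUnderTest />);
});

// Restore the original defaultConfig afterwards.
resetToDefaults();
```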
20 changes: 10 additions & 10 deletions docusaurus/docs/api.md
@@ -7,7 +7,7 @@ sidebar_position: 4

## Measurements

-### `measurePerformance()` function {#measure-renders}
+### `measureRenders()` function {#measure-renders}

:::info

@@ -20,17 +20,17 @@ measuring its performance and writing results to the output file. You can use op
of the testing.

```ts
-async function measurePerformance(
+async function measureRenders(
ui: React.ReactElement,
-  options?: MeasureOptions,
+  options?: MeasureRendersOptions,
): Promise<MeasureResults> {
```
#### Example {#measure-renders-example}
```ts
// sample.perf-test.tsx
-import { measurePerformance } from 'reassure';
+import { measureRenders } from 'reassure';
import { screen, fireEvent } from '@testing-library/react-native';
import { ComponentUnderTest } from './ComponentUnderTest';

@@ -40,14 +40,14 @@ test('Test with scenario', async () => {
await screen.findByText('Done');
};

-  await measurePerformance(<ComponentUnderTest />, { scenario });
+  await measureRenders(<ComponentUnderTest />, { scenario });
});
```
-### `MeasureOptions` type {#measure-renders-options}
+### `MeasureRendersOptions` type {#measure-renders-options}
```ts
-interface MeasureOptions {
+interface MeasureRendersOptions {
runs?: number;
warmupRuns?: number;
wrapper?: React.ComponentType<{ children: ReactElement }>;
@@ -154,13 +154,13 @@ configure({
});
```
-### `resetToDefault` function {#reset-to-defaults}
+### `resetToDefaults` function {#reset-to-defaults}
```ts
-resetToDefault(): void
+resetToDefaults(): void
```
-Reset current config to the original `defaultConfig` object. You can call `resetToDefault()` anywhere in your performance test file.
+Reset current config to the original `defaultConfig` object. You can call `resetToDefaults()` anywhere in your performance test file.
### Environmental variables
14 changes: 7 additions & 7 deletions docusaurus/docs/installation.md
@@ -44,10 +44,10 @@ Now that the library is installed, you can write your first test scenario in a fi

```ts
// ComponentUnderTest.perf-test.tsx
-import { measurePerformance } from 'reassure';
+import { measureRenders } from 'reassure';

test('Simple test', async () => {
-  await measurePerformance(<ComponentUnderTest />);
+  await measureRenders(<ComponentUnderTest />);
});
```

@@ -60,7 +60,7 @@ This test will measure render times of `ComponentUnderTest` during mounting and
If your component contains any async logic or you want to test some interaction, you should pass the `scenario` option:

```ts
-import { measurePerformance } from 'reassure';
+import { measureRenders } from 'reassure';
import { screen, fireEvent } from '@testing-library/react-native';

test('Test with scenario', async () => {
@@ -69,7 +69,7 @@ test('Test with scenario', async () => {
await screen.findByText('Done');
};

-  await measurePerformance(<ComponentUnderTest />, { scenario });
+  await measureRenders(<ComponentUnderTest />, { scenario });
});
```

@@ -78,7 +78,7 @@ The body of the `scenario` function is using familiar React Native Testing Libra
If you use a version of React Native Testing Library lower than v10.1.0, where the [`screen` helper](https://callstack.github.io/react-native-testing-library/docs/api/#screen) is not available, the `scenario` function provides it as its first argument:

```ts
-import { measurePerformance } from 'reassure';
+import { measureRenders } from 'reassure';
import { fireEvent } from '@testing-library/react-native';

test('Test with scenario', async () => {
@@ -87,7 +87,7 @@ test('Test with scenario', async () => {
await screen.findByText('Done');
};

-  await measurePerformance(<ComponentUnderTest />, { scenario });
+  await measureRenders(<ComponentUnderTest />, { scenario });
});
```

@@ -254,7 +254,7 @@ for performance tests you can add the following override to your `.eslintrc` file:
rules: {
'jest/expect-expect': [
'error',
-    { assertFunctionNames: ['expect', 'measurePerformance'] },
+    { assertFunctionNames: ['expect', 'measureRenders'] },
],
}
```
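For reference, a fuller sketch of where that override might sit in an `.eslintrc.js` — the `overrides` scoping and the file glob are assumptions, not part of this commit:

```js
// .eslintrc.js — hypothetical placement, scoped to perf test files only
module.exports = {
  overrides: [
    {
      files: ['**/*.perf-test.{ts,tsx}'],
      rules: {
        'jest/expect-expect': [
          'error',
          { assertFunctionNames: ['expect', 'measureRenders'] },
        ],
      },
    },
  ],
};
```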
18 changes: 9 additions & 9 deletions packages/measure/src/__tests__/measure-renders.test.tsx
@@ -1,7 +1,7 @@
import * as React from 'react';
import { View } from 'react-native';
import stripAnsi from 'strip-ansi';
-import { buildUiToRender, measurePerformance } from '../measure-renders';
+import { buildUiToRender, measureRenders } from '../measure-renders';
import { resetHasShownFlagsOutput } from '../output';

const errorsToIgnore = ['❌ Measure code is running under incorrect Node.js configuration.'];
@@ -15,9 +15,9 @@ beforeEach(() => {
});
});

-test('measurePerformance run test given number of times', async () => {
+test('measureRenders run test given number of times', async () => {
const scenario = jest.fn(() => Promise.resolve(null));
-  const results = await measurePerformance(<View />, { runs: 20, scenario, writeFile: false });
+  const results = await measureRenders(<View />, { runs: 20, scenario, writeFile: false });
expect(results.runs).toBe(20);
expect(results.durations).toHaveLength(20);
expect(results.counts).toHaveLength(20);
@@ -28,9 +28,9 @@ test('measurePerformance run test given number of times', async () => {
expect(scenario).toHaveBeenCalledTimes(21);
});

-test('measurePerformance applies "warmupRuns" option', async () => {
+test('measureRenders applies "warmupRuns" option', async () => {
const scenario = jest.fn(() => Promise.resolve(null));
-  const results = await measurePerformance(<View />, { runs: 10, scenario, writeFile: false });
+  const results = await measureRenders(<View />, { runs: 10, scenario, writeFile: false });

expect(scenario).toHaveBeenCalledTimes(11);
expect(results.runs).toBe(10);
@@ -40,9 +40,9 @@ test('measurePerformance applies "warmupRuns" option', async () => {
expect(results.stdevCount).toBe(0);
});

-test('measurePerformance should log error when running under incorrect node flags', async () => {
+test('measureRenders should log error when running under incorrect node flags', async () => {
resetHasShownFlagsOutput();
-  const results = await measurePerformance(<View />, { runs: 1, writeFile: false });
+  const results = await measureRenders(<View />, { runs: 1, writeFile: false });

expect(results.runs).toBe(1);
const consoleErrorCalls = jest.mocked(realConsole.error).mock.calls;
@@ -57,8 +57,8 @@ function IgnoreChildren(_: React.PropsWithChildren<{}>) {
return <View />;
}

-test('measurePerformance does not measure wrapper execution', async () => {
-  const results = await measurePerformance(<View />, { wrapper: IgnoreChildren, writeFile: false });
+test('measureRenders does not measure wrapper execution', async () => {
+  const results = await measureRenders(<View />, { wrapper: IgnoreChildren, writeFile: false });
expect(results.runs).toBe(10);
expect(results.durations).toHaveLength(10);
expect(results.counts).toHaveLength(10);
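The tests above exercise most of `MeasureRendersOptions`; a combined sketch of the options seen in this diff (`runs`, `warmupRuns`, `wrapper`, `scenario`, `writeFile`), with placeholder UI text:

```ts
// options-usage.perf-test.tsx — illustrative only
import * as React from 'react';
import { measureRenders } from 'reassure';
import { screen, fireEvent } from '@testing-library/react-native';
import { ComponentUnderTest } from './ComponentUnderTest';

// Wrappers render around the measured UI but are excluded from measurement.
function Providers({ children }: { children: React.ReactElement }) {
  return children;
}

test('all options together', async () => {
  await measureRenders(<ComponentUnderTest />, {
    runs: 20, // measured runs
    warmupRuns: 1, // unmeasured warmup runs before measuring
    wrapper: Providers,
    scenario: async () => {
      fireEvent.press(screen.getByText('Go')); // 'Go' is a placeholder
      await screen.findByText('Done');
    },
    writeFile: false, // skip writing results to the output file
  });
});
```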
2 changes: 1 addition & 1 deletion packages/measure/src/config.ts
@@ -26,6 +26,6 @@ export function configure(customConfig: Partial<Config>) {
};
}

-export function resetToDefault() {
+export function resetToDefaults() {
config = defaultConfig;
}
6 changes: 3 additions & 3 deletions packages/measure/src/index.ts
@@ -1,5 +1,5 @@
-export { configure, resetToDefault } from './config';
-export { measurePerformance } from './measure-renders';
+export { configure, resetToDefaults } from './config';
+export { measureRenders, measurePerformance } from './measure-renders';
export { measureFunction } from './measure-function';
-export type { MeasureOptions } from './measure-renders';
+export type { MeasureRendersOptions } from './measure-renders';
export type { MeasureFunctionOptions } from './measure-function';
25 changes: 21 additions & 4 deletions packages/measure/src/measure-renders.tsx
@@ -11,16 +11,16 @@ logger.configure({
silent: process.env.REASSURE_SILENT === 'true' || process.env.REASSURE_SILENT === '1',
});

-export interface MeasureOptions {
+export interface MeasureRendersOptions {
runs?: number;
warmupRuns?: number;
wrapper?: React.ComponentType<{ children: React.ReactElement }>;
scenario?: (screen: any) => Promise<any>;
writeFile?: boolean;
}

-export async function measurePerformance(ui: React.ReactElement, options?: MeasureOptions): Promise<MeasureResults> {
-  const stats = await measureRender(ui, options);
+export async function measureRenders(ui: React.ReactElement, options?: MeasureRendersOptions): Promise<MeasureResults> {
+  const stats = await measureRendersInternal(ui, options);

if (options?.writeFile !== false) {
await writeTestStats(stats, 'render');
@@ -29,7 +29,24 @@ export async function measurePerformance(ui: React.ReactElement, options?: Measu
return stats;
}

-export async function measureRender(ui: React.ReactElement, options?: MeasureOptions): Promise<MeasureResults> {
+/**
+ * @deprecated The `measurePerformance` function has been renamed to `measureRenders`. The `measurePerformance` alias is now deprecated and will be removed in future releases.
+ */
+export async function measurePerformance(
+  ui: React.ReactElement,
+  options?: MeasureRendersOptions
+): Promise<MeasureResults> {
+  logger.warnOnce(
+    'The `measurePerformance` function has been renamed to `measureRenders`.\n\nThe `measurePerformance` alias is now deprecated and will be removed in future releases.'
+  );
+
+  return await measureRenders(ui, options);
+}
+
+async function measureRendersInternal(
+  ui: React.ReactElement,
+  options?: MeasureRendersOptions
+): Promise<MeasureResults> {
const runs = options?.runs ?? config.runs;
const scenario = options?.scenario;
const warmupRuns = options?.warmupRuns ?? config.warmupRuns;
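The shim above keeps old suites running; a quick sketch of the resulting behavior (component and options are placeholders):

```ts
// deprecation-behavior.perf-test.tsx — illustrative sketch
import { measurePerformance, measureRenders } from 'reassure';
import { ComponentUnderTest } from './ComponentUnderTest';

test('deprecated alias delegates to measureRenders', async () => {
  // Logs a one-time rename warning via logger.warnOnce, then delegates.
  await measurePerformance(<ComponentUnderTest />, { writeFile: false });

  // Preferred API going forward:
  await measureRenders(<ComponentUnderTest />, { writeFile: false });
});
```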