test: add AudioRecorder tests #2373

Merged
merged 1 commit on Apr 30, 2024
1 change: 1 addition & 0 deletions src/components/Attachment/hooks/useAudioController.ts
@@ -64,6 +64,7 @@ export const useAudioController = ({
if (!audioRef.current) return;
try {
audioRef.current.pause();
setIsPlaying(false);
} catch (e) {
registerError(new Error(t('Failed to play the recording')));
}
25 changes: 18 additions & 7 deletions src/components/MediaRecorder/AudioRecorder/AudioRecorder.tsx
@@ -1,4 +1,4 @@
import React from 'react';
import React, { useMemo } from 'react';
import { AudioRecordingPreview } from './AudioRecordingPreview';
import { AudioRecordingInProgress } from './AudioRecordingInProgress';
import { MediaRecordingState } from '../classes';
@@ -19,47 +19,58 @@ export const AudioRecorder = () => {

const isUploadingFile = recording?.$internal?.uploadState === 'uploading';

const state = useMemo(
() => ({
paused: recordingState === MediaRecordingState.PAUSED,
recording: recordingState === MediaRecordingState.RECORDING,
stopped: recordingState === MediaRecordingState.STOPPED,
}),
[recordingState],
);

Comment on lines +22 to +29 (Member):

I think there is no need to memoize this. Inlining these values should be simpler:

const paused = recordingState === MediaRecordingState.PAUSED;
...
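
Not part of the diff: a rough sketch of the inlining suggested in the comment above. The isPaused/isRecording/isStopped names are assumed here so the flags do not shadow the recording attachment variable already used in the component.

// Sketch only (reviewer's suggestion, not the merged code): inline the three
// flags instead of memoizing them in a state object.
const isPaused = recordingState === MediaRecordingState.PAUSED;
const isRecording = recordingState === MediaRecordingState.RECORDING;
const isStopped = recordingState === MediaRecordingState.STOPPED;

Either form comes down to three strict equality checks per render, so the tradeoff is readability rather than performance; the merged diff above keeps the memoized state object.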

if (!recorder) return null;

return (
<div className='str-chat__audio_recorder-container'>
<div className='str-chat__audio_recorder' data-testid={'audio-recorder'}>
<button
className='str-chat__audio_recorder__cancel-button'
data-testid={'cancel-recording-audio-button'}
disabled={isUploadingFile}
onClick={recorder.cancel}
>
<BinIcon />
</button>

{recording?.asset_url ? (
{state.stopped && recording?.asset_url ? (
<AudioRecordingPreview
durationSeconds={recording.duration ?? 0}
mimeType={recording.mime_type}
src={recording.asset_url}
waveformData={recording.waveform_data}
/>
) : (
) : state.paused || state.recording ? (
<AudioRecordingInProgress />
)}
) : null}

{recordingState === MediaRecordingState.PAUSED && (
{state.paused && (
<button
className='str-chat__audio_recorder__resume-recording-button'
onClick={recorder.resume}
>
<MicIcon />
</button>
)}
{recordingState === MediaRecordingState.RECORDING && (
{state.recording && (
<button
className='str-chat__audio_recorder__pause-recording-button'
data-testid={'pause-recording-audio-button'}
onClick={recorder.pause}
>
<PauseIcon />
</button>
)}
{recordingState === MediaRecordingState.STOPPED ? (
{state.stopped ? (
<button
className='str-chat__audio_recorder__complete-button'
data-testid='audio-recorder-complete-button'
Expand Down
@@ -25,20 +25,22 @@
if (!recorder) return null;

return (
<div className='str-chat__audio_recorder__waveform-box'>
{amplitudes.slice(-maxDataPointsDrawn).map((amplitude, i) => (
<div
className='str-chat__wave-progress-bar__amplitude-bar'
key={`amplitude-${i}-voice-recording`}
style={
{
'--str-chat__wave-progress-bar__amplitude-bar-height': amplitude
? amplitude * 100 + '%'
: '0%',
} as React.CSSProperties
}
/>
))}
<div className='str-chat__waveform-box-container'>
<div className='str-chat__audio_recorder__waveform-box'>
{amplitudes.slice(-maxDataPointsDrawn).map((amplitude, i) => (

[Codecov warning (codecov/patch) on line 30 of src/components/MediaRecorder/AudioRecorder/AudioRecordingInProgress.tsx: added line #L30 was not covered by tests]
<div
className='str-chat__wave-progress-bar__amplitude-bar'
key={`amplitude-${i}-voice-recording`}
style={
{
'--str-chat__wave-progress-bar__amplitude-bar-height': amplitude
? amplitude * 100 + '%'
: '0%',
} as React.CSSProperties
}
/>
))}
</div>
</div>
);
};
@@ -64,12 +66,11 @@
mediaRecorder.removeEventListener('pause', stopCounter);
};
}, [recorder, startCounter, stopCounter]);

return (
<React.Fragment>
<RecordingTimer durationSeconds={secondsElapsed} />
<div className='str-chat__waveform-box-container'>
<AudioRecordingWaveform />
</div>
<AudioRecordingWaveform />
</React.Fragment>
);
};
@@ -28,7 +28,11 @@ export const AudioRecordingPreview = ({
<audio ref={audioRef}>
<source src={props.src} type={mimeType} />
</audio>
<button className='str-chat__audio_recorder__toggle-playback-button' onClick={togglePlay}>
<button
className='str-chat__audio_recorder__toggle-playback-button'
data-testid='audio-recording-preview-toggle-play-btn'
onClick={togglePlay}
>
{isPlaying ? <PauseIcon /> : <PlayIcon />}
</button>
<RecordingTimer durationSeconds={displayedDuration} />
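Not part of the PR: a minimal sketch of how the data-testid added above could be asserted, reusing the renderAudioRecorder helper, generateVoiceRecordingAttachment, and MediaRecordingState imports that appear in the test file below.

// Sketch only: with a stopped recording that has an asset_url, the
// AudioRecordingPreview should render and expose the new toggle-playback button.
it('renders the playback toggle button in the recording preview', async () => {
  await renderAudioRecorder({
    recording: generateVoiceRecordingAttachment(),
    recordingState: MediaRecordingState.STOPPED,
  });
  expect(
    screen.getByTestId('audio-recording-preview-toggle-play-btn'),
  ).toBeInTheDocument();
});
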
@@ -1,5 +1,5 @@
import React, { useEffect } from 'react';
import { act, fireEvent, render, screen } from '@testing-library/react';
import { act, fireEvent, render, screen, waitFor } from '@testing-library/react';
import '@testing-library/jest-dom';
import * as transcoder from '../../transcode';

Expand All @@ -9,6 +9,7 @@ import {
ChannelStateProvider,
ChatProvider,
ComponentProvider,
MessageInputContextProvider,
useMessageInputContext,
} from '../../../../context';
import {
Expand All @@ -28,13 +29,17 @@ import {
MediaRecorderMock,
} from '../../../../mock-builders/browser';
import { generateDataavailableEvent } from '../../../../mock-builders/browser/events/dataavailable';
import { AudioRecorder } from '../AudioRecorder';
import { MediaRecordingState } from '../../classes';

const PERM_DENIED_NOTIFICATION_TEXT =
'To start recording, allow the microphone access in your browser';

const START_RECORDING_AUDIO_BUTTON_TEST_ID = 'start-recording-audio-button';
const CANCEL_RECORDING_AUDIO_BUTTON_TEST_ID = 'cancel-recording-audio-button';
const PAUSE_RECORDING_AUDIO_BUTTON_TEST_ID = 'pause-recording-audio-button';
const AUDIO_RECORDER_STOP_BTN_TEST_ID = 'audio-recorder-stop-button';
const AUDIO_RECORDER_TEST_ID = 'audio-recorder';
const AUDIO_RECORDER_COMPLETE_BTN_TEST_ID = 'audio-recorder-complete-button';

const CSS_THEME_VERSION = '2';
@@ -62,8 +67,8 @@ const renderComponent = async ({
client,
} = await initClientWithChannels();
let result;
await act(() => {
result = render(
await act(async () => {
result = await render(
<ChatProvider
value={{
...{ client, ...DEFAULT_RENDER_PARAMS.chatCtx, ...chatCtx },
@@ -94,6 +99,8 @@ jest.mock('nanoid', () => ({

jest.mock('fix-webm-duration', () => jest.fn((blob) => blob));

jest.spyOn(console, 'warn').mockImplementation();

jest
.spyOn(transcoder, 'transcode')
.mockImplementation((opts) =>
@@ -189,6 +196,44 @@ describe('MessageInput', () => {
expect(screen.queryByTestId(AUDIO_RECORDER_TEST_ID)).toBeInTheDocument();
});

it.each([MediaRecordingState.PAUSED, MediaRecordingState.RECORDING, MediaRecordingState.STOPPED])(
'renders message composer when recording cancelled while recording in state %s',
async (state) => {
const { container } = await renderComponent();
const Input = () => container.querySelector('.str-chat__message-input');
await waitFor(() => {
expect(Input()).toBeInTheDocument();
});

await act(() => {
fireEvent.click(screen.queryByTestId(START_RECORDING_AUDIO_BUTTON_TEST_ID));
});
await waitFor(() => {
expect(Input()).not.toBeInTheDocument();
});

if (state === MediaRecordingState.PAUSED) {
await act(() => {
fireEvent.click(screen.queryByTestId(PAUSE_RECORDING_AUDIO_BUTTON_TEST_ID));
});
} else if (state === MediaRecordingState.STOPPED) {
await act(() => {
fireEvent.click(screen.queryByTestId(AUDIO_RECORDER_STOP_BTN_TEST_ID));
});
}
await waitFor(() => {
expect(Input()).not.toBeInTheDocument();
});

await act(() => {
fireEvent.click(screen.queryByTestId(CANCEL_RECORDING_AUDIO_BUTTON_TEST_ID));
});
await waitFor(() => {
expect(Input()).toBeInTheDocument();
});
},
);

it('does not show RecordingPermissionDeniedNotification until start recording button clicked if microphone permission is denied', async () => {
expect(screen.queryByText(PERM_DENIED_NOTIFICATION_TEXT)).not.toBeInTheDocument();
const status = new EventEmitterMock();
@@ -309,14 +354,75 @@ describe('MessageInput', () => {
expect(sendMessage).not.toHaveBeenCalled();
});
});

const recorderMock = {};

const DEFAULT_RECORDING_CONTROLLER = {
completeRecording: jest.fn(),
recorder: recorderMock,
recording: undefined,
recordingState: undefined,
};

const renderAudioRecorder = (controller = {}) =>
render(
<ChannelActionProvider value={{}}>
<MessageInputContextProvider
value={{ recordingController: { ...DEFAULT_RECORDING_CONTROLLER, ...controller } }}
>
<AudioRecorder />
</MessageInputContextProvider>
</ChannelActionProvider>,
);

describe('AudioRecorder', () => {
it.todo('does not render anything if recorder is not available');
it.todo('renders audio recording in progress UI');
it.todo('renders audio recording paused UI when paused');
it.todo('renders audio recording in progress UI when recording resumed');
it.todo('renders audio recording stopped UI when stopped');
it.todo('renders message composer when recording cancelled while recording');
it.todo('renders message composer when recording cancelled while paused');
it.todo('renders message composer when recording cancelled while stopped');
it.todo('renders loading indicators while recording being uploaded');
it('does not render anything if recorder is not available', async () => {
const { container } = await renderAudioRecorder({ recorder: undefined });
expect(container).toBeEmpty();
});

it('renders audio recording in progress UI', async () => {
const { container } = await renderAudioRecorder({
recordingState: MediaRecordingState.RECORDING,
});
expect(container).toMatchSnapshot();
});
it('renders audio recording paused UI when paused', async () => {
const { container } = await renderAudioRecorder({
recordingState: MediaRecordingState.PAUSED,
});
expect(container).toMatchSnapshot();
});
it('renders audio recording stopped UI when stopped without recording preview', async () => {
const { container } = await renderAudioRecorder({
recordingState: MediaRecordingState.STOPPED,
});
expect(container).toMatchSnapshot();
});
it('renders audio recording stopped UI with recording preview', async () => {
const { container } = await renderAudioRecorder({
recording: generateVoiceRecordingAttachment(),
recordingState: MediaRecordingState.STOPPED,
});
expect(container).toMatchSnapshot();
});

it.each([MediaRecordingState.PAUSED, MediaRecordingState.RECORDING])(
'does not render recording preview if %s',
async (state) => {
const { container } = await renderAudioRecorder({
recording: generateVoiceRecordingAttachment(),
recordingState: state,
});
expect(container).toMatchSnapshot();
},
);

it('renders loading indicators while recording being uploaded', async () => {
await renderAudioRecorder({
recording: generateVoiceRecordingAttachment({ $internal: { uploadState: 'uploading' } }),
recordingState: MediaRecordingState.STOPPED,
});
expect(screen.queryByTestId('loading-indicator')).toBeInTheDocument();
});
});