
feat: delete user attributes #7342

Merged

Conversation

hideokamoto
Contributor

Issue #, if available:

Ref: Expose promisified helper to delete attributes on Auth #3119
Currently, the Auth class has no method to delete user attributes.

Description of changes:

I've added the method to the class.

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
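The referenced issue asks for a promisified helper around the callback-style Cognito API. The pattern can be sketched as below; note that `MockCognitoUser` is an illustrative stand-in of my own, not the actual Amplify implementation — the real method wraps `CognitoUser.deleteAttributes` from amazon-cognito-identity-js in essentially the same way.

```typescript
// Sketch of the promisification pattern this PR applies.
// MockCognitoUser stands in for amazon-cognito-identity-js's CognitoUser,
// whose deleteAttributes takes a Node-style callback.
type NodeCallback = (err: Error | null, result?: string) => void;

class MockCognitoUser {
  public attributes: Record<string, string> = {
    email: "user@example.com",
    nickname: "hideokamoto",
  };

  // Callback-style API, as on the real CognitoUser.
  deleteAttributes(attributeNames: string[], callback: NodeCallback): void {
    for (const name of attributeNames) {
      delete this.attributes[name];
    }
    callback(null, "SUCCESS");
  }
}

// Promisified helper, analogous to the deleteUserAttributes method
// added to the Auth class in this PR.
function deleteUserAttributes(
  user: MockCognitoUser,
  attributeNames: string[]
): Promise<string> {
  return new Promise((resolve, reject) => {
    user.deleteAttributes(attributeNames, (err, result) => {
      if (err) reject(err);
      else resolve(result as string);
    });
  });
}

async function main() {
  const user = new MockCognitoUser();
  const result = await deleteUserAttributes(user, ["nickname"]);
  console.log(result); // SUCCESS
  console.log(Object.keys(user.attributes)); // [ 'email' ]
}

main();
```

With this shape, callers can `await Auth.deleteUserAttributes(user, names)` instead of threading callbacks through application code.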

@codecov

codecov bot commented Dec 4, 2020

Codecov Report

Merging #7342 (4e453ea) into main (7c8eb92) will decrease coverage by 0.04%.
The diff coverage is 11.11%.


@@            Coverage Diff             @@
##             main    #7342      +/-   ##
==========================================
- Coverage   74.26%   74.22%   -0.05%     
==========================================
  Files         215      215              
  Lines       13470    13479       +9     
  Branches     2645     2647       +2     
==========================================
+ Hits        10004    10005       +1     
- Misses       3268     3276       +8     
  Partials      198      198              
Impacted Files Coverage Δ
packages/auth/src/Auth.ts 87.38% <11.11%> (-0.80%) ⬇️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

@codecov-commenter

codecov-commenter commented Jun 7, 2021

Codecov Report

Merging #7342 (bcd70b9) into main (b16f8e7) will decrease coverage by 0.00%.
The diff coverage is 90.00%.


@@            Coverage Diff             @@
##             main    #7342      +/-   ##
==========================================
- Coverage   77.80%   77.80%   -0.01%     
==========================================
  Files         240      240              
  Lines       17121    17131      +10     
  Branches     3650     3651       +1     
==========================================
+ Hits        13321    13328       +7     
- Misses       3675     3678       +3     
  Partials      125      125              
Impacted Files Coverage Δ
packages/auth/src/Auth.ts 87.38% <90.00%> (+0.02%) ⬆️
...kages/amazon-cognito-identity-js/src/BigInteger.js 89.31% <0.00%> (-0.41%) ⬇️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

Contributor

@hkjpotato hkjpotato left a comment


Thank you for submitting the PR!

public deleteUserAttributes(
user: CognitoUser | any,
attributeNames: string[],
clientMetadata: ClientMetaData = this._config.clientMetadata
Contributor


Can you remove clientMetadata? It is not accepted by the deleteAttributes function; behind the scenes, Cognito's DeleteUserAttributes does not respect this input.

Contributor Author


Thanks for the feedback!
I've updated the code to remove it :)

@hkjpotato
Contributor

Since PR #8578 has been merged, I am good with the new deleteUserAttributes API on the Auth class.

@hideokamoto, can you add unit tests for the new deleteUserAttributes method, like those for the sibling updateUserAttributes method? Basically:

  • You need to mock the deleteAttributes call on CognitoUser, like this.
  • Then add unit tests for the Auth deleteUserAttributes method, like this.
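The suggested test can be sketched as follows. For self-containment this uses a hand-rolled spy and a hypothetical `FakeCognitoUser` stand-in; in the repo you would instead use `jest.spyOn(CognitoUser.prototype, 'deleteAttributes')` and call the Auth method itself.

```typescript
// Hand-rolled sketch of the requested unit test (no jest, so it runs standalone).
type Callback = (err: Error | null, result?: string) => void;

class FakeCognitoUser {
  // Recorded calls, playing the role of a jest mock's .mock.calls.
  calls: string[][] = [];

  // Mocked deleteAttributes: record the arguments, then invoke the
  // callback with a success result, like the suggested jest mock would.
  deleteAttributes(attributeNames: string[], callback: Callback): void {
    this.calls.push(attributeNames);
    callback(null, "SUCCESS");
  }
}

// The behavior under test: Auth.deleteUserAttributes promisifies the
// callback-style CognitoUser API (local helper shown for illustration).
function deleteUserAttributes(
  user: FakeCognitoUser,
  attributeNames: string[]
): Promise<string> {
  return new Promise((resolve, reject) =>
    user.deleteAttributes(attributeNames, (err, res) =>
      err ? reject(err) : resolve(res as string)
    )
  );
}

async function testDeletesAttributes() {
  const user = new FakeCognitoUser();
  const result = await deleteUserAttributes(user, ["nickname", "locale"]);
  console.assert(result === "SUCCESS", "should resolve with SUCCESS");
  console.assert(user.calls.length === 1, "deleteAttributes called once");
  console.assert(
    user.calls[0].join() === "nickname,locale",
    "called with the given attribute names"
  );
  console.log("test passed");
}

testDeletesAttributes();
```

The same structure covers the rejection path by having the mock call `callback(new Error(...))` and asserting the promise rejects.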

@hideokamoto
Contributor Author

hideokamoto commented Jul 19, 2021

The test command fails to run:

% yarn run test --scope @aws-amplify/auth
yarn run v1.22.10
$ lerna run test --stream --scope @aws-amplify/auth
lerna notice cli v3.22.1
lerna info versioning independent
lerna notice filter including "@aws-amplify/auth"
lerna info filter [ '@aws-amplify/auth' ]
lerna info Executing command in 1 package: "yarn run test"
@aws-amplify/auth: $ yarn lint --fix && jest -w 1 --coverage
@aws-amplify/auth: $ tslint '{__tests__,src}/**/*.ts' --fix
@aws-amplify/auth: ts-jest[config] (WARN) TypeScript diagnostics (customize using `[jest-config].globals.ts-jest.diagnostics` option):
@aws-amplify/auth: message TS151001: If you have issues related to imports, you should consider setting `esModuleInterop` to `true` in your TypeScript configuration file (usually `tsconfig.json`). See https://blogs.msdn.microsoft.com/typescript/2018/01/31/announcing-typescript-2-7/#easier-ecmascript-module-interoperability for more information.
@aws-amplify/auth: <--- Last few GCs --->
@aws-amplify/auth: [17977:0x108008000]   102392 ms: Mark-sweep (reduce) 4093.0 (4102.3) -> 4092.3 (4103.3) MB, 4721.1 / 0.0 ms  (average mu = 0.120, current mu = 0.003) allocation failure scavenge might not succeed
@aws-amplify/auth: [17977:0x108008000]   107467 ms: Mark-sweep (reduce) 4093.3 (4105.3) -> 4092.8 (4105.8) MB, 5066.0 / 0.0 ms  (average mu = 0.060, current mu = 0.002) allocation failure scavenge might not succeed
@aws-amplify/auth: <--- JS stacktrace --->
@aws-amplify/auth: FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
@aws-amplify/auth:  1: 0x10130d5e5 node::Abort() (.cold.1) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
@aws-amplify/auth:  2: 0x1000b2289 node::Abort() [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
@aws-amplify/auth:  3: 0x1000b23ef node::OnFatalError(char const*, char const*) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
@aws-amplify/auth:  4: 0x1001f68c7 v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
@aws-amplify/auth:  5: 0x1001f6863 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
@aws-amplify/auth:  6: 0x1003a47e5 v8::internal::Heap::FatalProcessOutOfMemory(char const*) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
@aws-amplify/auth:  7: 0x1003a628a v8::internal::Heap::RecomputeLimits(v8::internal::GarbageCollector) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
@aws-amplify/auth:  8: 0x1003a19b5 v8::internal::Heap::PerformGarbageCollection(v8::internal::GarbageCollector, v8::GCCallbackFlags) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
@aws-amplify/auth:  9: 0x10039f2e0 v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
@aws-amplify/auth: 10: 0x1003ad9ea v8::internal::Heap::AllocateRawWithLightRetrySlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
@aws-amplify/auth: 11: 0x1003ada71 v8::internal::Heap::AllocateRawWithRetryOrFailSlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
@aws-amplify/auth: 12: 0x100374fc7 v8::internal::FactoryBase<v8::internal::Factory>::NewFixedArrayWithFiller(v8::internal::Handle<v8::internal::Map>, int, v8::internal::Handle<v8::internal::Oddball>, v8::internal::AllocationType) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
@aws-amplify/auth: 13: 0x10037c643 v8::internal::Factory::NewFrameArray(int) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
@aws-amplify/auth: 14: 0x1003416a3 v8::internal::(anonymous namespace)::CaptureStackTrace(v8::internal::Isolate*, v8::internal::Handle<v8::internal::Object>, v8::internal::(anonymous namespace)::CaptureStackTraceOptions) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
@aws-amplify/auth: 15: 0x100341e61 v8::internal::Isolate::CaptureAndSetSimpleStackTrace(v8::internal::Handle<v8::internal::JSReceiver>, v8::internal::FrameSkipMode, v8::internal::Handle<v8::internal::Object>) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
@aws-amplify/auth: 16: 0x10028a610 v8::internal::Builtin_Impl_ErrorCaptureStackTrace(v8::internal::BuiltinArguments, v8::internal::Isolate*) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
@aws-amplify/auth: 17: 0x100a82099 Builtins_CEntry_Return1_DontSaveFPRegs_ArgvOnStack_BuiltinExit [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
@aws-amplify/auth: 18: 0x100a1ae42 Builtins_InterpreterEntryTrampoline [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
@aws-amplify/auth: /bin/sh: line 1: 17977 Abort trap: 6           jest -w 1 --coverage
@aws-amplify/auth: error Command failed with exit code 134.
@aws-amplify/auth: info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
lerna ERR! yarn run test exited 134 in '@aws-amplify/auth'

@hideokamoto
Contributor Author

hideokamoto commented Jul 19, 2021

@hkjpotato

I can't run the tests locally; I haven't found a way to resolve the heap error.

How can we resolve it?

% node --trace-gc --max-old-space-size=2056   ./node_modules/.bin/lerna run test --scope @aws-amplify/auth
[8922:0x108008000]       56 ms: Scavenge 2.5 (3.0) -> 2.1 (4.0) MB, 1.2 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[8922:0x108008000]       87 ms: Scavenge 2.7 (4.3) -> 2.4 (5.3) MB, 1.0 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[8922:0x108008000]      108 ms: Scavenge 3.5 (7.6) -> 2.9 (8.1) MB, 0.7 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[8922:0x108008000]      160 ms: Scavenge 5.0 (8.1) -> 3.8 (8.3) MB, 0.8 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[8922:0x108008000]      194 ms: Scavenge 5.6 (8.3) -> 4.7 (9.1) MB, 1.0 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[8922:0x108008000]      229 ms: Scavenge 6.9 (13.6) -> 5.8 (14.6) MB, 1.0 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[8922:0x108008000]      310 ms: Scavenge 10.9 (15.6) -> 8.1 (16.9) MB, 1.1 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[8922:0x108008000]      373 ms: Scavenge 11.7 (16.9) -> 9.3 (17.6) MB, 1.4 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[8922:0x108008000]      439 ms: Scavenge 13.1 (17.6) -> 10.7 (19.1) MB, 1.1 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[8922:0x108008000]      502 ms: Scavenge 14.4 (27.4) -> 11.8 (28.6) MB, 1.3 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
lerna notice cli v3.22.1
lerna info versioning independent
[8922:0x108008000]      573 ms: Mark-sweep 17.1 (28.6) -> 9.5 (28.7) MB, 1.9 / 0.1 ms  (+ 2.0 ms in 19 steps since start of marking, biggest step 0.2 ms, walltime since start of marking 11 ms) (average mu = 1.000, current mu = 1.000) finalize incremental marking via task GC in old space requested
lerna notice filter including "@aws-amplify/auth"
lerna info filter [ '@aws-amplify/auth' ]
lerna info Executing command in 1 package: "yarn run test"
[8922:0x108008000]     8678 ms: Mark-sweep (reduce) 16.6 (29.0) -> 10.0 (27.7) MB, 7.0 / 0.0 ms  (+ 7.6 ms in 64 steps since start of marking, biggest step 0.3 ms, walltime since start of marking 16 ms) (average mu = 0.998, current mu = 0.998) finalize incremental marking via task GC in old space requested
[8922:0x108008000]     9301 ms: Mark-sweep (reduce) 10.0 (15.7) -> 10.0 (15.0) MB, 8.8 / 0.0 ms  (+ 7.3 ms in 81 steps since start of marking, biggest step 0.3 ms, walltime since start of marking 17 ms) (average mu = 0.996, current mu = 0.974) finalize incremental marking via task GC in old space requested
lerna ERR! yarn run test exited 134 in '@aws-amplify/auth'
lerna ERR! yarn run test stdout:
yarn run v1.22.10
$ yarn lint && jest -w 1 --coverage
$ tslint '{__tests__,src}/**/*.ts'
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.

lerna ERR! yarn run test stderr:
ts-jest[config] (WARN) TypeScript diagnostics (customize using `[jest-config].globals.ts-jest.diagnostics` option):
message TS151001: If you have issues related to imports, you should consider setting `esModuleInterop` to `true` in your TypeScript configuration file (usually `tsconfig.json`). See https://blogs.msdn.microsoft.com/typescript/2018/01/31/announcing-typescript-2-7/#easier-ecmascript-module-interoperability for more information.

<--- Last few GCs --->

[8969:0x110008000]    86872 ms: Mark-sweep (reduce) 4093.7 (4103.0) -> 4092.9 (4104.2) MB, 3504.1 / 0.0 ms  (average mu = 0.072, current mu = 0.001) allocation failure scavenge might not succeed
[8969:0x110008000]    90284 ms: Mark-sweep (reduce) 4093.9 (4103.2) -> 4093.1 (4104.5) MB, 3405.4 / 0.0 ms  (average mu = 0.038, current mu = 0.002) allocation failure scavenge might not succeed


<--- JS stacktrace --->

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
 1: 0x10130d5e5 node::Abort() (.cold.1) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
 2: 0x1000b2289 node::Abort() [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
 3: 0x1000b23ef node::OnFatalError(char const*, char const*) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
 4: 0x1001f68c7 v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
 5: 0x1001f6863 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
 6: 0x1003a47e5 v8::internal::Heap::FatalProcessOutOfMemory(char const*) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
 7: 0x1003a628a v8::internal::Heap::RecomputeLimits(v8::internal::GarbageCollector) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
 8: 0x1003a19b5 v8::internal::Heap::PerformGarbageCollection(v8::internal::GarbageCollector, v8::GCCallbackFlags) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
 9: 0x10039f2e0 v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
10: 0x1003ad9ea v8::internal::Heap::AllocateRawWithLightRetrySlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
11: 0x1003ada71 v8::internal::Heap::AllocateRawWithRetryOrFailSlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
12: 0x10037b862 v8::internal::Factory::NewFillerObject(int, bool, v8::internal::AllocationType, v8::internal::AllocationOrigin) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
13: 0x1006fbb38 v8::internal::Runtime_AllocateInYoungGeneration(int, unsigned long*, v8::internal::Isolate*) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
14: 0x100a81fb9 Builtins_CEntry_Return1_DontSaveFPRegs_ArgvOnStack_NoBuiltinExit [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
/bin/sh: line 1:  8969 Abort trap: 6           jest -w 1 --coverage
error Command failed with exit code 134.

lerna ERR! yarn run test exited 134 in '@aws-amplify/auth'
node --trace-gc --max-old-space-size=20240   ./node_modules/.bin/lerna run test --scope @aws-amplify/auth

lerna ERR! yarn run test stderr:
ts-jest[config] (WARN) TypeScript diagnostics (customize using `[jest-config].globals.ts-jest.diagnostics` option):
message TS151001: If you have issues related to imports, you should consider setting `esModuleInterop` to `true` in your TypeScript configuration file (usually `tsconfig.json`). See https://blogs.msdn.microsoft.com/typescript/2018/01/31/announcing-typescript-2-7/#easier-ecmascript-module-interoperability for more information.

<--- Last few GCs --->

[19730:0x108008000]    99319 ms: Mark-sweep (reduce) 4095.0 (4106.7) -> 4094.9 (4107.7) MB, 4106.4 / 0.0 ms  (+ 87.8 ms in 16 steps since start of marking, biggest step 15.1 ms, walltime since start of marking 4205 ms) (average mu = 0.071, current mu = 0.
[19730:0x108008000]   103623 ms: Mark-sweep (reduce) 4095.9 (4104.7) -> 4095.5 (4106.5) MB, 4268.8 / 0.0 ms  (+ 24.8 ms in 16 steps since start of marking, biggest step 13.6 ms, walltime since start of marking 4305 ms) (average mu = 0.036, current mu = 0.

<--- JS stacktrace --->

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
 1: 0x10130d5e5 node::Abort() (.cold.1) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
 2: 0x1000b2289 node::Abort() [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
 3: 0x1000b23ef node::OnFatalError(char const*, char const*) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
 4: 0x1001f68c7 v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
 5: 0x1001f6863 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
 6: 0x1003a47e5 v8::internal::Heap::FatalProcessOutOfMemory(char const*) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
 7: 0x1003a628a v8::internal::Heap::RecomputeLimits(v8::internal::GarbageCollector) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
 8: 0x1003a19b5 v8::internal::Heap::PerformGarbageCollection(v8::internal::GarbageCollector, v8::GCCallbackFlags) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
 9: 0x10039f2e0 v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
10: 0x1003ad9ea v8::internal::Heap::AllocateRawWithLightRetrySlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
11: 0x1003ada71 v8::internal::Heap::AllocateRawWithRetryOrFailSlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
12: 0x10037b862 v8::internal::Factory::NewFillerObject(int, bool, v8::internal::AllocationType, v8::internal::AllocationOrigin) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
13: 0x1006fbb38 v8::internal::Runtime_AllocateInYoungGeneration(int, unsigned long*, v8::internal::Isolate*) [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
14: 0x100a81fb9 Builtins_CEntry_Return1_DontSaveFPRegs_ArgvOnStack_NoBuiltinExit [/Users/okamotohidetaka/.anyenv/envs/nodenv/versions/14.17.1/bin/node]
/bin/sh: line 1: 19730 Abort trap: 6           jest -w 1 --coverage
error Command failed with exit code 134.

lerna ERR! yarn run test exited 134 in '@aws-amplify/auth'
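One detail worth noting about the commands above (my observation, not from this thread): passing `--max-old-space-size` to the node process that runs lerna only raises lerna's own heap; jest is spawned in a child process by `yarn run test`, which is why it still aborts near the ~4 GB default limit. The `NODE_OPTIONS` environment variable, by contrast, is read by every child node process, so a common workaround looks like:

```shell
# Raise the V8 heap limit for every node process started below,
# including the jest child that lerna/yarn spawn.
export NODE_OPTIONS="--max-old-space-size=4096"

# Confirm that child processes inherit it:
sh -c 'echo "child sees: $NODE_OPTIONS"'

# Then re-run the tests:
# yarn run test --scope @aws-amplify/auth
```

The 4096 MB value here is an arbitrary example; pick a limit that fits the machine.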

@hkjpotato
Contributor

hkjpotato commented Jul 19, 2021

@hideokamoto thank you for adding the test, nice work! Your test actually passes when run by CircleCI, and I also checked out your change locally and verified that it passes.

I haven't seen the heap error you show above. Have you followed https://github.com/aws-amplify/amplify-js/blob/main/CONTRIBUTING.md#steps-towards-contributions to run yarn run test --scope @aws-amplify/auth?

I will check with my teammates to see if we can merge this PR now; I might need to add some doc changes as well.

Contributor

@elorzafe elorzafe left a comment


Thank you @hideokamoto 🌮 🎉 🥇

@hkjpotato
Contributor

doc update PR: aws-amplify/docs#3413

@hkjpotato hkjpotato merged commit 1b1df67 into aws-amplify:main Jul 23, 2021
@hideokamoto hideokamoto deleted the feat/auth/delete-user-attributes branch July 24, 2021 05:42
@github-actions

This pull request has been automatically locked since there hasn't been any recent activity after it was closed. Please open a new issue for related bugs.

Looking for a help forum? We recommend joining the Amplify Community Discord server *-help channels or Discussions for those types of questions.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Jul 25, 2022