- Wechaty Token
You can apply for a Windows / Pad protocol token from our puppet service providers:
Copy the following shell script and paste it into the terminal of your server to set up your Wechaty token:
# learn how to DIY a Wechaty Puppet Service token at http://wechaty.js.org/docs/puppet-services/diy
export WECHATY_TOKEN=insecure_wechaty_puppet_service_token_diy
# Set the port for your Puppet Service: it must be publicly accessible on the internet
# The Wechaty IO Client uses this port to publish the Puppet Service
export WECHATY_PUPPET_SERVER_PORT=48788
# learn more about Wechaty Puppet PadLocal at https://wechaty.js.org/docs/puppet-services/padlocal
export WECHATY_PUPPET=wechaty-puppet-padlocal
# get a 7 days free token at PadLocal official website: http://pad-local.com/
export WECHATY_PUPPET_PADLOCAL_TOKEN=YOUR_PADLOCAL_TOKEN_HERE
export WECHATY_LOG=verbose
docker run \
--rm \
-ti \
-e WECHATY_LOG \
-e WECHATY_PUPPET \
-e WECHATY_PUPPET_PADLOCAL_TOKEN \
-e WECHATY_PUPPET_SERVER_PORT \
-e WECHATY_TOKEN \
-p "$WECHATY_PUPPET_SERVER_PORT:$WECHATY_PUPPET_SERVER_PORT" \
wechaty/wechaty:0.78
Learn more: Puppet Service: DIY, a guide that will help you generate a Wechaty Token for connecting to the Wechaty Puppet Service.
We have four steps in our live coding session; each is saved in a separate branch for easy loading and testing.
Branch: ng_china_2020_step_1_ng_new_my-app
npx --package @angular/cli ng new my-app
cd my-app
ng serve --open
Learn more from https://angular.io/guide/setup-local
Branch: ng_china_2020_step_2_wechaty
npm i @chatie/angular brolog
import { WechatyModule } from '@chatie/angular'
@NgModule({
  imports: [
    WechatyModule,
    // ...
  ],
  // ...
})
export class AppModule {}
<wechaty
  #wechaty
  token="insecure_wechaty_puppet_service_token_diy"
  (heartbeat)="onHeartbeat($event)"
  (scan)="onScan($event)"
  (login)="wechaty.startSyncMessage(); onLogin($event)"
  (message)="onMessage($event)"
>
</wechaty>
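The component class behind this template might look like the sketch below. The event payload shapes and the QR-code rendering URL are illustrative assumptions, not part of the original code:

```typescript
// Sketch of the event handlers wired up in the <wechaty> template.
// Payload shapes and the QR image URL are assumptions for illustration.

/** Build a scannable QR image URL from the raw qrcode string of a `scan` event. */
export function qrcodeImageUrl(qrcode: string): string {
  return 'https://wechaty.js.org/qrcode/' + encodeURIComponent(qrcode)
}

export class AppComponent {
  qrcodeUrl = ''
  loggedIn = false

  onHeartbeat(data: string): void {
    console.info('heartbeat:', data)
  }

  onScan(event: { qrcode: string; status: number }): void {
    // Render the login QR code so the user can scan it with their phone
    this.qrcodeUrl = qrcodeImageUrl(event.qrcode)
  }

  onLogin(contact: { name: string }): void {
    this.loggedIn = true
    console.info('logged in as', contact.name)
  }

  onMessage(message: { text: string }): void {
    console.info('message:', message.text)
  }
}
```

Handlers like these stay free of Angular-specific APIs, so they are easy to unit-test in isolation.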
Branch: ng_china_2020_step_3_toxicity
npm install @tensorflow/tfjs
npm install @tensorflow-models/toxicity
ng generate service toxicity
Learn more:
- TensorFlow.js models: toxicity classifier source
- TensorFlow.js toxicity classifier demo: This is a demo of the TensorFlow.js toxicity model, which classifies text according to whether it exhibits offensive attributes (e.g., profanity or sexual explicitness).
- Text classification using TensorFlow.js: an example of detecting offensive language in the browser
The traffic light code is copy/pasted from this great tutorial: Stop in the Name of the Traffic Light
To be written.
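The glue between the toxicity model and the traffic light can be sketched as below. The prediction shape follows the @tensorflow-models/toxicity README, but the color policy (red on any match, yellow when the model is undecided) is an illustrative assumption:

```typescript
// Map toxicity-classifier predictions to a traffic-light color.
// Per the toxicity model README, each prediction looks like:
//   { label: 'insult', results: [{ probabilities: ..., match: boolean | null }] }
// where match is null when no probability crosses the threshold.

interface ToxicityPrediction {
  label: string
  results: Array<{ match: boolean | null }>
}

export type Light = 'red' | 'yellow' | 'green'

export function trafficLight(predictions: ToxicityPrediction[]): Light {
  const matches = predictions.map(p => (p.results[0] ? p.results[0].match : null))
  if (matches.some(m => m === true)) return 'red'      // some toxic label matched
  if (matches.some(m => m === null)) return 'yellow'   // model undecided on some label
  return 'green'                                       // all labels below threshold
}
```

Keeping this mapping separate from the model call makes the threshold policy easy to tweak and test without loading TensorFlow.js.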
Branch: step_4_tensorflow-models_qna
npm install @tensorflow-models/qna
// to be written
Learn more:
- TensorFlow.js models: Question and Answer source: use a pre-trained model to answer questions based on the content of a given passage.
- TensorFlow.js models: Question and Answer demo
- TensorFlow Blog: Exploring helpful uses for BERT in your browser with TensorFlow.js
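Once the qna model returns its candidate answers, picking one to reply with is plain TypeScript. The `{ text, score }` shape follows the @tensorflow-models/qna README; the picker itself is an illustrative sketch:

```typescript
// Pick the highest-scoring answer from the candidates returned by the qna model.
// The { text, score } shape follows the @tensorflow-models/qna README;
// returning null for an empty candidate list is a design choice for this sketch.

interface Answer {
  text: string
  score: number
}

export function bestAnswer(answers: Answer[]): Answer | null {
  if (answers.length === 0) return null
  return answers.reduce((best, a) => (a.score > best.score ? a : best))
}
```

In the bot, a null result would mean the passage contains no answer, so the reply logic can fall back to a default message.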
1. Ng+ Developers Conference 2020 Keynote: Conversational AI, Chatbot, and Angular, Huan, Nov 21, 2020
November 21-22, online
Knowledge, ideas, and insights for the Next Generation
- ngChina 2020: https://ng-plus.dev
- ngChina 2019: https://ng-china.org
Google Slides https://docs.google.com/presentation/d/1Gd3D8bS6OifXDsdSe0x5i6XsP_uISX3W9tR8yBA0mYs/edit?usp=sharing
Talk Video: https://youtu.be/SACugbTNQnc
- TensorFlow.js Tutorials
- TensorFlow.js Models
- TensorFlow.js Demos
- TensorFlow.js Examples
- TensorFlow.js Gallery
Huan LI (李卓桓), Google Machine Learning Developer Expert, zixia@zixia.net
- Docs released under Creative Commons
- Code released under the Apache-2.0 License
- Code & Docs © 2020-2021 Huan LI <zixia@zixia.net>