hd-worker-a -2. hd-worker-a -> elementary-os -3. elementary-os -> hd-worker-b -4. hd-worker-b -> elementary-os - -### Basic Environment Setup - -在修改 Hadoop 的配置之前,需要进行配置的是所有节点的环境变量设置与必要的基础程序. - -#### JDK - -Hadoop 运行在 Java 环境中,所以每个节点都需要安装 JDK. 需要保证的是确保每一台节点上安装的 JDK 版本一致. P.S 我自己是 Master OpenJDK-8 + Slaves OpenJDK-7. 目前还是正常运行的 (顺便吐槽一下 `Ubuntu14.04`默认的 apt-get 源,相当傻逼.在不添加自己订阅的其他源的情况下连 OpenJDK8 的地址都没有,而且如果安装 Git 之类的工具,为求稳定居然用的是 1.7 以下的版本. 这也是为什么我日常开发用的是`elementary-os`,虽然也是基于 ubuntu14 的内核, 但是 elementary-os 修改了其默认的 apt 源,ui 看起来也更加顺眼) - -```bash -$ sudo apt-get install openjdk-7-jdk -``` - -通过此举,安装的默认的 jdk 路径是`/usr/lib/jvm/java-7-openjdk-amd64`. OpenJDK8 同理. OracleJDK 也推荐复制到`/usr/lib/jvm`目录下.(守序善良 Linux 派优雅的约定之一) - -记住这里咯.在下面我们会将这个 JDK 的目录,加到当前用户`hduser`的`.bashrc`中. - -### Configure Hadoop - -终于到了这一步. 建议首先在 Master 上机器修改好 Hadoop 的配置.然后压缩该文件夹,复制到其他 Slave 节点上的同一目录. - -#### Unpack and move hadoop folder - -假设下载好的 hadoop-2.7.2.tar.gz 在 当前用户的`Downloads`文件夹中. 解压完毕之后,将其移动到`/usr/local`下,并更名为`hadoop` - -``` -$ mv hadoop-2.7.2 /usr/local/hadoop -``` - -#### Update Environment File - -在配置 Hadoop 的过程中,下列配置文件将会被修改. 
> ~/.bashrc
> /usr/local/hadoop/etc/hadoop/slaves
> /usr/local/hadoop/etc/hadoop/hadoop-env.sh
> /usr/local/hadoop/etc/hadoop/core-site.xml
> /usr/local/hadoop/etc/hadoop/yarn-site.xml
> /usr/local/hadoop/etc/hadoop/mapred-site.xml
> /usr/local/hadoop/etc/hadoop/hdfs-site.xml

##### ~/.bashrc

还记得之前提过的 JDK 路径吗,将其配置成`JAVA_HOME`. 修改当前用户的 bash 配置文件,将下列内容加到.bashrc 的底部

```bash
$ cd ~
$ vi .bashrc
```

```sh
#Hadoop variables
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64/
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
```

##### /usr/local/hadoop/etc/hadoop/hadoop-env.sh

还是跟上面一样,需要将 JDK 的路径设置成`JAVA_HOME`

```
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64/
```

##### /usr/local/hadoop/etc/hadoop/core-site.xml

在`<configuration></configuration>`之间添加一个 fs.default.name,其值为 master 机器的 9000 端口. 譬如我的 master 机器是`elementary-os`,则 value 是`hdfs://elementary-os:9000`. P.S.接下来的变量`{master-hostname}`请自行替换成自己的 master 的机器名.

```xml
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://{master-hostname}:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>file:/usr/local/hadoop_store/tmp</value>
  </property>
</configuration>
```

##### /usr/local/hadoop/etc/hadoop/yarn-site.xml

在`<configuration></configuration>`之间添加:

```xml
<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
    <value>org.apache.hadoop.mapred.ShuffleHandler</value>
  </property>
  <property>
    <name>yarn.resourcemanager.hostname</name>
    <value>{master-hostname}</value>
  </property>
</configuration>
```

##### /usr/local/hadoop/etc/hadoop/mapred-site.xml

`mapred-site.xml`默认是不存在的.
但是有一份模板文件`mapred-site.xml.template`,我们将其复制并重命名成`mapred-site.xml`

```bash
$ cp /usr/local/hadoop/etc/hadoop/mapred-site.xml.template /usr/local/hadoop/etc/hadoop/mapred-site.xml
```

在`<configuration></configuration>`之间添加:

```xml
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>{master-hostname}:9001</value>
  </property>
  <property>
    <name>mapreduce.jobhistory.address</name>
    <value>{master-hostname}:10020</value>
  </property>
  <property>
    <name>mapreduce.jobhistory.webapp.address</name>
    <value>{master-hostname}:19888</value>
  </property>
</configuration>
```

##### /usr/local/hadoop/etc/hadoop/hdfs-site.xml

在修改`hdfs-site.xml`这个配置文件之前,我们需要多知道一件事: hdfs 的块状文件,储存在一个指定的目录中. 按照官方文档的推荐,和网上一些文件夹路径的约定,我们将这个 hdfs 的文件储存目录叫做`hadoop_store`,绝对路径为`/usr/local/hadoop_store`

于是 hadoop 的相关文件夹就变成了两个:

> /usr/local/hadoop
> /usr/local/hadoop_store

由于读写权限问题,我们需要将`hadoop_store`的权限改成任意可读可写

```bash
$ sudo mkdir -p /usr/local/hadoop_store
$ sudo chmod -R 777 /usr/local/hadoop_store
```

然后再在配置文件里面加入

```xml
<configuration>
  <property>
    <name>dfs.namenode.secondary.http-address</name>
    <value>{master-hostname}:50090</value>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:/usr/local/hadoop_store/hdfs/namenode</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:/usr/local/hadoop_store/hdfs/datanode</value>
  </property>
</configuration>
```

##### slaves

`slaves`文件里面存储的是作为 slave 的节点的机器名. 以行为单位,一行一个. 默认只有一行 localhost. 从一般的集群角度来说,Master 不应该担当 Worker 的角色(老湿布置作业给小学僧,自己是不会一起做作业的). 所以 slaves 文件一般只写 slave 节点的名字,即 slave 节点作为 datanode,master 节点仅仅作为 namenode.

但是由于我是一名好老湿,所以在本机配置中 master 也充当了 worker 的角色,所以本机是这样改的:

```
elementary-os
hd-worker-a
hd-worker-b
```

至此,所有的配置文件已经修改完毕. 可以将 master 上的 hadoop 文件夹压缩并且分发到各个 slave 节点上.

#### Last Configure : Format Namenode

最后一步配置,初始格式化 hdfs

```bash
$ cd /usr/local/hadoop/
$ hdfs namenode -format
```

### Start all Hadoop daemons

启动 Hadoop 服务.
```bash
$ su hduser
$ cd /usr/local/hadoop/
$ sbin/start-dfs.sh
$ sbin/start-yarn.sh
```

如果启动成功,在 master 节点上通过 jps 命令查看,应该包含如下 hadoop 进程

```
hduser@elementary-os:~$ jps
51288 Jps
22914 ResourceManager
22361 NameNode
23229 NodeManager
22719 SecondaryNameNode
```

在 slave 节点上通过 jps 命令查看,应该包含如下 hadoop 进程

```
hduser@hd-worker-a:~$ jps
6284 NodeManager
6150 DataNode
6409 Jps
```

或者可以通过浏览器访问[http://master:8088](http://master:8088) 或者[http://master:50070](http://master:50070) 查看 Hadoop 服务状态.

![Nodes of the cluster](./nodes-of-the-cluster.png) ![Namenode information](./data-node-information.png)

P.S.关于`jps`命令: jps 位于 jdk 的 bin 目录下,其作用是显示当前系统的 java 进程情况及其 id 号. jps 相当于 linux 进程工具 ps,但是不支持管道命令 grep. jps 并不使用应用程序名来查找 JVM 实例.

## Troubleshooting

防跌坑指南. 记录了在 Hadoop 环境搭建过程中所遇到的坑

### Number of Live DataNode: 0

通过`start-dfs.sh`启动了 hadoop 多个节点的 datanode, 且通过`jps`命令能够看到正常的 datanode 和 resourcemanager 进程, 为什么 live datanode 数目为 0,或者只有 master 的那个 datanode?

可通过以下方法排查:

1. 关闭所有节点的防火墙(ubuntu): 先查看防火墙状态

```bash
$ sudo ufw status
```

如果不是 disabled,则禁用

```bash
$ sudo ufw disable
```

2. 在 hadoop 服务运行的时候,关闭 namenode 的安全模式

```bash
$ hadoop dfsadmin -safemode leave
```

3. 在关闭 hadoop 服务的情况下,删除所有的日志文件,存储文件并重新 format. 确保`hadoop_store`文件夹下的所有文件夹权限都是 777

```bash
$ sudo rm -r /usr/local/hadoop/logs
$ sudo rm -r /usr/local/hadoop_store/tmp
$ sudo rm -r /usr/local/hadoop_store/hdfs
$ sudo hdfs namenode -format
```

## References

在环境搭建的过程中,参考了以下两篇文章: 其中 Apache 的官方 Wiki 文档写的真难读. 建议直接先看一遍 aws 的指南再动手.
- -[https://wiki.apache.org/hadoop/GettingStartedWithHadoop](https://wiki.apache.org/hadoop/GettingStartedWithHadoop) [https://rstudio-pubs-static.s3.amazonaws.com/](https://rstudio-pubs-static.s3.amazonaws.com/78508_abe89197267240dfb6f4facb361a20ed.html) diff --git a/data/posts/2016/04/10/google-codejam-2016-qualification-round.md b/data/posts/2016/04/10/google-codejam-2016-qualification-round.md deleted file mode 100755 index c84f8bfce..000000000 --- a/data/posts/2016/04/10/google-codejam-2016-qualification-round.md +++ /dev/null @@ -1,408 +0,0 @@ ---- -title: Google CodeJam 2016 Qualification -id: google-codejam-2016-qualification-round -created: 2016-04-10 -updated: 2016-04-10 -categories: - - Note -tags: - - Java - - Google -cover: ./cover.png ---- - -# Google CodeJam 2016 Qualification - -今早结束的 Google CodeJam 2016 资格赛. 由于智商问题和加班了一天,所以只能水出前面两道水题. 但是还是稍微涨了点姿势. 记录下解题的过程和一些小彩蛋. - -将我的 A 和 B 的 Solution 放在[Github](https://github.com/Aquariuslt/CodeJam)上了. - -## A: Counting Sheep - -### Problem Description - -Bleatrix Trotter the sheep has devised a strategy that helps her fall asleep faster. First, she picks a number N. Then she starts naming N, 2 × N, 3 × N, and so on. Whenever she names a number, she thinks about all of the digits in that number. She keeps track of which digits (0, 1, 2, 3, 4, 5, 6, 7, 8, and 9) she has seen at least once so far as part of any number she has named. Once she has seen each of the ten digits at least once, she will fall asleep. - -Bleatrix must start with N and must always name (i + 1) × N directly after i × N. For example, suppose that Bleatrix picks N = 1692. She would count as follows: - -N = 1692. Now she has seen the digits 1, 2, 6, and 9. 2N = 3384. Now she has seen the digits 1, 2, 3, 4, 6, 8, and 9. 3N = 5076. Now she has seen all ten digits, and falls asleep. What is the last number that she will name before falling asleep? If she will count forever, print INSOMNIA instead. 
- -Input - -The first line of the input gives the number of test cases, T. T test cases follow. Each consists of one line with a single integer N, the number Bleatrix has chosen. - -Output - -For each test case, output one line containing Case #x: y, where x is the test case number (starting from 1) and y is the last number that Bleatrix will name before falling asleep, according to the rules described in the statement. - -Limits - -1 ≤ T ≤ 100. Small dataset - -0 ≤ N ≤ 200. Large dataset - -0 ≤ N ≤ 10^6. Sample - -Input - -``` -5 -0 -1 -2 -11 -1692 -``` - -Output - -``` -Case #1: INSOMNIA -Case #2: 10 -Case #3: 90 -Case #4: 110 -Case #5: 5076 -``` - -In Case #1, since 2 × 0 = 0, 3 × 0 = 0, and so on, Bleatrix will never see any digit other than 0, and so she will count forever and never fall asleep. Poor sheep! In Case #2, Bleatrix will name 1, 2, 3, 4, 5, 6, 7, 8, 9, 10. The 0 will be the last digit needed, and so she will fall asleep after 10. In Case #3, Bleatrix will name 2, 4, 6... and so on. She will not see the digit 9 in any number until 90, at which point she will fall asleep. By that point, she will have already seen the digits 0, 1, 2, 3, 4, 5, 6, 7, and 8, which will have appeared for the first time in the numbers 10, 10, 2, 30, 4, 50, 6, 70, and 8, respectively. In Case #4, Bleatrix will name 11, 22, 33, 44, 55, 66, 77, 88, 99, 110 and then fall asleep. Case #5 is the one described in the problem statement. Note that it would only show up in the Large dataset, and not in the Small dataset. - -### Translation - -这道题相当容易读懂,表面意思就是: 一个叫`Bleatrix`的家伙睡觉之前喜欢数羊咩,但是他要数到一定条件才睡得着. 他每次会从一个数字`N`开始数.第一下数`N`,第二下数`2*N`...第 M 下数`M*N`. 当从开始数到后面,一直到出现过的数字包含了`1234567890`所有数字的时候就会睡着了. 求的是数字`N`对应的让他能够睡着的那个数. - -### Solution - -做法是用一个从 N 开始枚举. 出现过的数字用`HashSet`来保存,每出现一个数字的时候,将该数字按照每一位拆分,打进这个`HashSet`里面. 当`HashSet`的长度大于等于 10 的时候跳出循环. 
- -### Source Code - -```java -package com.aquariuslt.codejam; - -import com.aquariuslt.codejam.utils.Reader; - -import org.junit.Test; - -import java.io.InputStream; -import java.util.HashSet; -import java.util.Set; - -/** Created by Aquariuslt on 4/9/16.*/ -public class CountingSheep { - private static int numberOfCases; - private static int startSheepNumber[]; - private static int result[]; - - - private static void input(){ - InputStream inputStream = ClassLoader.getSystemResourceAsStream("A/A-large.in"); - Reader.init(inputStream); - try{ - numberOfCases = Reader.nextInt(); - startSheepNumber = new int[numberOfCases]; - result = new int[numberOfCases]; - for(int i=0;i digitalSet = new HashSet<>(); - if(singleNumber==0){ - return 0; - } - else{ - int currentNumber = singleNumber; - while(digitalSet.size()<10){ - digitalSet.addAll(convertIntToDigitalSet(currentNumber)); - currentNumber += singleNumber; - } - return currentNumber; - } - } - - private static Set convertIntToDigitalSet(int number){ - int currentNumber = number; - Set digitalSet = new HashSet<>(); - while(currentNumber/10>0){ - digitalSet.add(currentNumber % 10); - currentNumber = currentNumber/10; - } - return digitalSet; - } - - private static void output(){ - for(int i=0;i= 0; i--) { - if ((pancakeArray[i] ^ revengeFlag) == 0) { //通过异或得出当前的面实际朝向. 
- revengeCount++; - revengeFlag = 1 - revengeFlag; - } - } - return revengeCount; -} -``` - -### Source Code - -```java -package com.aquariuslt.codejam; - -import com.aquariuslt.codejam.utils.Reader; - -import org.junit.Test; - -import java.io.IOException; -import java.io.InputStream; - -/** Created by Aquariuslt on 4/9/16.*/ -public class RevengePancakes { - private static final int MAX_STRING_LENGTH = 101; - - private int caseCount; - private int[][] pancakeIntArray; - private int[] result; - - private void input() { - InputStream inputStream = ClassLoader.getSystemResourceAsStream("B/B-large.in"); - Reader.init(inputStream); - try { - caseCount = Reader.nextInt(); - result = new int[caseCount]; - pancakeIntArray = new int[caseCount][MAX_STRING_LENGTH]; - for (int i = 0; i < caseCount; i++) { - String currentPancakeString = Reader.next(); - pancakeIntArray[i] = convertStringToInt(currentPancakeString); - } - } catch (IOException e) { - //e.printStackTrace(); - } - - } - - private int[] convertStringToInt(String currentPancakeString) { - int currentPancakeStringLength = currentPancakeString.length(); - int[] currentPancakeIntArray = new int[currentPancakeStringLength]; - for (int i = 0, strLength = currentPancakeString.length(); i < strLength; i++) { - currentPancakeIntArray[i] = currentPancakeString.charAt(i) == '+' ? 
1 : 0; - } - return currentPancakeIntArray; - } - - private void solve() { - for (int i = 0, length = result.length; i < length; i++) { - result[i] = solveSingleCase(pancakeIntArray[i]); - } - } - - private int solveSingleCase(int[] pancakeArray) { - int revengeCount = 0; - int revengeFlag = 0; - for (int pancakeLength = pancakeArray.length, i = pancakeLength - 1; i >= 0; i--) { - if ((pancakeArray[i] ^ revengeFlag) == 0) { - revengeCount++; - revengeFlag = 1 - revengeFlag; - } - } - return revengeCount; - } - - private void output() { - for (int i = 0; i < caseCount; i++) { - System.out.printf("Case #%d: %d\n", (i + 1), result[i]); - } - } - - - @Test - public void testRevengePancakes() { - input(); - solve(); - output(); - } -} - - -/** - * if '-' means '0', '+' means '1' we can convert case to: Case 1: input : - 0 target: + - * 1 - * - * Case 2: input : -+ 01 target: ++ 11 - * - * Case 3: input : +- 10 target: ++ 11 - * - * Case 4: input : +++ 111 target: +++ 111 - * - * Case 5: input : --+- 0010 target: ++++ 1111 - */ -``` - -## Java Reader in ACM - -本来一直在用`Java的Scanner做input`. 但是一直没想过如果正式比赛还真的有人用 Java 去提交,那么`Scanner`的性能到底如何呢. 很久之前看过一篇文章比较`cin`和`scanf`的性能. 然后看到了这篇文章,通过数据比较高呼`Java Scanner is Slooooow` - -[Faster Input for Java](https://www.cpe.ku.ac.th/~jim/java-io.html) - -通过比较 Java 的`Scanner`与`BufferedReader` + `StringTokenizer`来比较性能的话. 证明了`Scanner`读入输入流相对要慢 4 倍. - -所以我在代码里面第一次使用了这种方式 - -```java -public class Reader { - private static BufferedReader reader; - private static StringTokenizer tokenizer; - - /** call this method to initialize reader for InputStream */ - public static void init(InputStream input) { - reader = new BufferedReader( - new InputStreamReader(input) ); - tokenizer = new StringTokenizer(""); - } - - /** get next word */ - public static String next() throws IOException { - while ( ! 
tokenizer.hasMoreTokens() ) {
      tokenizer = new StringTokenizer(
             reader.readLine() );
    }
    return tokenizer.nextToken();
  }

  public static int nextInt() throws IOException {
    return Integer.parseInt( next() );
  }

  static double nextDouble() throws IOException {
    return Double.parseDouble( next() );
  }
}
```

然后将输入文件放在 resource 里面,将输入流直接改成 resource 即可. Usage:

```java
public class CountingSheep{

  private static void input(){
    InputStream inputStream = ClassLoader.getSystemResourceAsStream("A/A-large.in");
    Reader.init(inputStream);
    try{
      numberOfCases = Reader.nextInt();
      startSheepNumber = new int[numberOfCases];
      result = new int[numberOfCases];
      for(int i=0;i<numberOfCases;i++){
        startSheepNumber[i] = Reader.nextInt();
      }
    } catch(IOException e){
      //e.printStackTrace();
    }
  }
}
```
```html
<ul>
  <li ng-repeat="notifications in notificationList">
    <span>{{notifications.title}}</span>
    <span>{{notifications.summary}}</span>
    <span>{{notifications.time}}</span>
  </li>
</ul>
-``` - -经过一轮生产环境数据统计,某部分用户的未读通知范围会在 200 - 7W 条. -哈哈看到就尿了,如果这么算的话,7W 条的那个用户页面将会有至少 7W\*4 = 28W 的 watcher 在监听他们的变化. -且不论数据为什么需要全部渲染出来,如果将代码修改成 Once Binding,则页面的长期 watcher 数量将会减少 28W 个. - -特别是对于使用了`ng-repeat`的元素,一定要考虑将 - -使用一次绑定表达式之后如下 - -```html -
<ul>
  <li ng-repeat="notifications in ::notificationList">
    <span>{{::notifications.title}}</span>
    <span>{{::notifications.summary}}</span>
    <span>{{::notifications.time}}</span>
  </li>
</ul>
-``` - -### Use variable instead function expression - -之前遇到一个需求,在业务逻辑上需要显示一个模型,这个模型大概是下面这样的: - -```json -{ - "businessKeys": [ - { - "type": "a", - "value": "aValue" - }, - { - "type": "b", - "value": "bValue" - }, - { - "type": "c", - "value": "cValue" - }, - { - "type": "c", - "value": "cValue" - } - ], - "otherInfo": "otherInfo..." -} -``` - -在字段里面是有一个不定长的数组,数组里面实际上是一堆 key-value 形式的键值对. 之所以不定长是因为里面有时候有些 key 是没有对应的值的. 在 UI 上显示出来的时候,先前的做法就是绑定一个方法: - -``` -{{vm.getValueByBusinessKeysType(object,keyName)}} -``` - -在页面上使用一个方法表达式而不是直接的变量表达式的时候,会导致方法执行多次. 由于这个`getValueByBusinessKeysType`的方法,需要通过数组查找而不是直接一个 map 所以就会导致性能问题. - -目前的解决方案是:将数据在加载的时候经过扁平化处理,即将 key 直接以 property 的形式直接赋予 Object. 通过直接绑定 property 表达式来显示. 这样也有效提高了一些性能 - -### Chain Filter - -Angular 的 Filter 性能一直不够好. 在刚刚接触 Angular 的时候,阅读文档发现 Filter 的功能还挺好用的,特别是做一些关键字过滤表格数据,格式处理等方面的工作, -实在是太方便了,于是我们在为我们的 table 的 header 上每个 column 都添加了一个关键字过滤框,使用 angular 的 filter 做分页的工作. - -由于我们的表格需要显示的业务数据比较多,column 数大概在 15-25 左右. 在每一个 header 的 column 上添加独立的关键字过滤框,大概就添加了 20 个. - -假设当前页面的总数据 会有有 30 条. 用户喜欢在几个过滤框上输入一些相关的关键字信息过滤.(filterA,filterB) - -```js -function filterA(dataArray, filterCriteria) { - return filteredDataArrayByFilterCriteriaA; -} - -function filterB(dataArray, filterCriteria) { - return filteredDataArrayByFilterCriteriaB; -} -``` - -当每一个过滤框都属于一个单独的 filter 去绑定的话,如果执行 AB filter,将会按照下面的顺序执行 - -> `dataArray` length:30 `filteredDataArrayByFilterCriteriaA` (至少两次 filterA,此时 length 约 20) -> `filteredDataArrayByFilterCriteriaB` (至少两次 filterB,此时 length 约 5) - -如果我们在计算关键字过滤的时候使用的是遍历查询,以单次对单个元素对比的操作工作量为 1. -那么在这两重 filter 的总计算工作量就会变成`30*2+20*2=100`次 - -回到实际业务,通过在 filter 中添加 log 来记录 filter 循环运算的次数,惊讶的发现实际上 filter 的运算次数在 25 个 column 的情况下, -普遍一次过滤框的查询,会导致 3K 左右的运算次数,相当惊人. - -目前的解决方案是通过降低工程代码的可读性,将多个 filter 的功能合并成一个总的 filter,在总的 filter 里面处理一连串的单个 filter 过滤过程. 
之前的代码可能是这样:

```html
```

```js
function filterA(dataArray) {
  //implement filterA
}

function filterB(dataArray) {
  //implement filterB
}

function filterC(dataArray) {
  //implement filterC
}
```

合并之后看起来是这样

```html
```

> 1. 里面没有家具,需要自己购置
> 2. 合同是跟华发物业签而非个人房东

这两点真的是太适合我了,一方面是租房合同可以随时终止,而且不用跟各种恶毒的个人房东打交道,二是新交的房子没有家具. 就不用担心租到不合适的房子看到家具风格不搭心里长草.

从提交申请,审核,到正式通知抽签选房,签合同入住大概经历了快三个月.
三月初提交的申请,劳动节之后终于通知下来去抽签.

一层的户型图大概如下图: 从结构上分,个人申请只能申请一房一厅或者单身公寓了.
![IMG_0916.png](./room-architecture.png) 抽到了个 08 户型,一房一厅,实用面积才 30 多.

## 布置

### 空无一物的房间

签完合同刚拿到房子的时候,房间里面除了空调和稍微有点像样的厨房灶台之外,其他空无一物,显得比较空洞.
大概是这样子的:

#### 空洞的过道

![空洞的过道](./passing-route.png)

#### 空洞的客厅

![空洞的客厅](./meeting-room.png)

#### 空洞的睡房

![空洞的睡房](./bedroom.png)

#### 空洞的厨房

![空洞的厨房](./kitchen.png)

#### 空洞的阳台

![空洞的阳台](./sunroom.png)

### 购置刚需家具

距离之前租的房子大概还有半个月到期,购置刚需家具就成了午休时间的任务.
由于我已经有一张黑胡桃色的电脑桌,所以整体的家具都选用棕色/黑胡桃色吧.
根据刚需的优先级,大概列了一下:

> 拐角电脑桌 >= 床 > 电脑椅 >> 书柜 >> 衣柜

家电的话,大概就只有

> 洗衣机 >> 电饭煲

### 安装与布置

陆续请了几个半天年假去办妥水电网络等各种开通手续之后,床和拐角电脑桌也送到了. 花了一个下午才把他们装好. 接着爸妈给我送来了窗帘和碗碟,还有一些大大小小的厨具~

五月下旬住进去之后一直没有买书柜和室内的晾衣架,直到前几天才送到,感觉现在就差个舒适的床垫就完美了~

下面是上图时间:

#### 大门过道就加了一个垫子

![21:37:44.png](./passing-route-updated.png)

#### 只有骨架的电脑桌

![21:39:14.png](./desktop.png)

#### 上百螺丝还要反过来装

![21:51:48.png](./desktop-installation.png)

#### 安装完毕 配上西昊 M18 P2415Q+2414H 强 无敌

#### 原来的电脑桌变成了饭桌

![21:57:22.png](./depracted-desktop.png)

#### 室内衣架和一个杂物柜

![21:30:18.png](./clothes.png)

#### 低到类似榻榻米的床与床头柜

![21:28:49.png](./bed.png)

#### 楼再高一点就是海景啦

![22:01:15.png](./see-sea.png)

## 憧憬

配置齐全 感觉每天晚上专注 ~~学习~~ 的时间变长了.
`wakatime`的 report ![22:10:03.png](./wakatime.png) - -![22:19:24.png](./wakatime-total.png) - -希望在如此好的环境中,学习和工作效率会变得越来越高~ diff --git a/data/posts/2017/01/01/review-2016.md b/data/posts/2017/01/01/review-2016.md deleted file mode 100755 index d9fb7ebca..000000000 --- a/data/posts/2017/01/01/review-2016.md +++ /dev/null @@ -1,64 +0,0 @@ ---- -title: Year in Review 2016 -id: review-2016 -created: 2017-01-01 -updated: 2017-01-01 -categories: - - Others -tags: - - Diary -cover: ./cover.png ---- - -# Year in Review 2016 - -## The Year Not bad - -> 今年只能算是不差的一年. - -下半年没怎么更新博客,因为一直打算在用`angular2`+`SpringBoot Series`来更新下一代系统. - -反复修改都赶不上最新的 Dev Guideline.. - -后来`Material2` beta component 也越来越多了. - -家里也有一些烦心的事情,几度接近崩溃. - -(并不是因为 7.0 开了好吗) - -## Study Progress - -新年定下好几个目标, 加上公司制定的一些学习计划. 到现在只能说完成了一半,有一部分没有达到所谓的进阶的目的.(花的时间应该差不多了,但是效率奇低,没有质变的提升) - -硬要从语言层次划分的话 - -- JS/TS \*\*\*\* -- Java \*\* - -接下来的半年计划,应该是以完成去年定下的学习目标为主 - -## New PC - -一开始换了 4K 显示器的时候,老 PC 的 GTX460 估摸着不能输出 2K 以上分辨率的信号. 适逢老黄 1000 系列的卡横空出世,看到 GTX1060 以相同的功耗性能怒草上一代 GTX980 的评测,一下长草就下单了个 1060. - -结果回来插在 H61 主板上面,把电源和主板炸了.. - -真是"3000 预算进卡吧 四路泰坦带回家"的节奏.立马把 1060 退了,老的 CPU 出了二手,准备下单新 PC. - -结果就变成这样了... - -老黄坑了一把,说好的新架构 Mac WebDriver 呢? - -我一直在等待 WebDriver 的出现带动黑苹果 - -![6700K+32G+GTX1070](./hardware-info.png) - -## Google Pixel - -新一代亲儿子,贵的出汁. - -除了三星 P 拍 AMOLED 屏 比 Nexus5 的 IPS 观感要差之外,其他完美. - -## Summary - -On the way go. 
diff --git a/data/posts/2017/03/12/new-version-blog-migration.md b/data/posts/2017/03/12/new-version-blog-migration.md deleted file mode 100755 index 9813766e7..000000000 --- a/data/posts/2017/03/12/new-version-blog-migration.md +++ /dev/null @@ -1,71 +0,0 @@ ---- -title: Blog Structure Update -id: new-version-blog-migration -created: 2017-03-12 -updated: 2017-03-12 -categories: - - Blog -tags: - - Blog - - Angular -cover: ./cover.png ---- - -# Blog Structure Update - -## Background - -最近正在接触学习`Angular2`+`RxJS`相关知识.当`Angular`发布了`@angular/cli`之后, 内置的 webpack 工作流程提供了一个官方推荐的比较完整的编译,打包,配置切换的工作流程, 使得其在工程化方面显得有板有眼,愈发被我所接受. - -于是使用`Angular2`+完成度极低`@angular/material`重写之前的 Blog 框架. - -## Features - -基于`Angular2`所提供的解决方案,相对之前用`Angular1`版本,提供了如下新功能:(包括但不限于 Angular 本身) - -1. 支持自己文章的一些新定义的 metadata. 比如有些文章属于特殊的 category,不会显示在首页上. 只有从 category 下面进去看到对应的文章. -2. 新的边框主题色,通过在 HTML header 里面声明`theme-color`来实现. -3. 支持渐进式网页应用 [Progressive Web Apps](https://developers.google.com/web/progressive-web-apps/). 通过现代的移动浏览器,已经可以添加一个离线的快捷方式查看. - -通过 Chrome 浏览的时候提示可以添加到桌面 - -![通过Chrome浏览的时候提示可以添加到桌面](./add-shortcut.png) - -桌面版本也可以可以添加快捷方式,像桌面版本的 Google Keep 一样 - -![桌面版本快捷方式](./desktop-version-shortcut.png) - -添加成功后桌面会有一个快捷方式 - -![添加成功后桌面会有一个快捷方式](./shortcut.png) - -打开快捷方式会有一个自定义的启动动画 - -![打开快捷方式会有一个默认的启动动画](./launching.png) - -``` -颜色,图表可通过PWA提供的manifest.webapp来配置 -``` - -此外,还有如下变化: - -1. 通过[CloudFlare](https://www.cloudflare.com/)提供的免费 SSL 证书 全站 https. -2. 修改了构建出来的文章形式,返回的文章信息以过滤后的 token 形式,数据文件大小更小了. - -## TODO - -1. 修改了解析后的文章段落的标题,添加一个 TOC 的实现 -2. 添加代码块的语法高亮 -3. 添加大量过场动画 -4. 添加 GFM,FlowChart 的支持 - -## Finally - -代码位于[https://github.com/Aquariuslt/Site](https://github.com/Aquariuslt/Site) 的新默认分支`NG2`下. - -之前版本的文章与内容,还是保留在[https://aquariuslt.com](https://aquariuslt.com)中. - -``` -文章的源文件内容和Schema并没有大的改动,只是在源代码里面加多了一些metadata的解析工作. -正在准备逐步迁移过来,并且删除掉那些经过自我检讨之后没有什么卵用的垃圾文章. 
-``` diff --git a/data/posts/2017/05/05/vue-version-for-blog-app.md b/data/posts/2017/05/05/vue-version-for-blog-app.md deleted file mode 100755 index 806d35c3d..000000000 --- a/data/posts/2017/05/05/vue-version-for-blog-app.md +++ /dev/null @@ -1,84 +0,0 @@ ---- -title: Vue Version Blog App -id: vue-version-for-blog-app -created: 2017-05-05 -updated: 2017-05-05 -categories: - - Blog -tags: - - Blog - - Vue - - PWA -cover: ./cover.png ---- - -# Vue Version Blog App - -## Change Log - -这篇文章写于 2017-05-05. 下面一部分没实现的功能已经基本实现,并且做了更多的配置外化工作. - -详情将会发布到新的一篇文章里面. - -## Background - -在阅读过 Vue 的官方文档之后,我尝试用其为一个数据可视化项目的图表做一个 Refine,以寻求渲染性能与响应变化上性能的提升,与更细致,可自定义的动画效果. - -虽然 Vue 是一个渐进式的前端框架,但是突然想以 Vue 全家桶去实现一次 Angular1.x 项目中所有的功能,于是便以自己的 Blog App 作为一个初始项目进行练手. - -从四月份开始进行 Vue 的学习,目前 Vue 版本的 Blog App 已经实现了[@Angular 版本](https://github.com/Aquariuslt/Blog/tree/NG2)的所有功能. - -记录一下中间的历程. - -根据目前所做的工作, - -代码放在[Vue 分支](https://github.com/Aquariuslt/Blog)上. - -## Features - -目前实现的功能有: - -- Single Page Application [单页应用] -- Progressive Web Application [渐进式网页应用] -- Markdown Writing [使用 Markdown 进行写作] -- Support Code Highlight [支持代码高亮] -- Disqus [支持 Disqus 评论] -- Configurable [抽取配置到独立的配置文件] -- Sitemap auto generating [自动生成 Sitemap] - -中间有一些跌坑之后还在纠结于没找到优雅的解决方案的地方: - -- No Support Pre-rendering [不支持预渲染] - > 为单页应用进行预渲染,生成对应的静态 index.html,可以有效被搜索引擎收录 Vue 本身支持 webpack 的`prerender-spa-plugin`. 但`Vue-Material`的菜单展开方式是动态渲染的,所以目前还不能做到预渲染.(这里跌了几天的坑) 目前部署在 Github Pages 上的话会没有 SEO. 因为 SPA 在搜索引擎爬的时候会先返回一个 404,再根据 Github 的约定返回 404.html. 搜索引擎就把该 url 当成失效的链接. 部署在 VPS 上的话支持 SEO. - -## Development - -### Dependencies - -为了实现与 Angular 版本相同的效果,才用的 Vue 全家桶 + 其他主要的库是 - -- Vue [2.3.2] -- Vuex -- Vue-Router -- VueMaterial -- Axios [前后端通用的 http 请求框架] -- Marked [Markdown 解析部分] -- Hightlight.js [为 Markdown 的代码片段渲染出高亮效果] - -### Development Course - -从头到尾,大概的功能开发思路是如此的: - -1. 阅读 Vue + Vuex + Vue-Router 的文档 -2. 学习 Vue-Webpack Template 中的项目结构与构建方式 -3. 重写基于 Marked 的 Markdown Post API -4. 确定基本的 Gulp 构建任务流 -5. 
以纯 ES6 的方式修改 Webpack 与 Gulp 任务流 -6. 使用 Vue 全家桶完成基本界面开发 -7. 重构应用部分的代码成模块化加载方式 -8. 添加 PWA,Sitemap 等功能 -9. 添加 CI 配置 - -## Usage & Document - -参见: [Blog App Usage](https://github.com/Aquariuslt/Blog/tree/VUE#usage) diff --git a/data/posts/2018/01/01/review-2017.md b/data/posts/2018/01/01/review-2017.md deleted file mode 100755 index a75d05d9f..000000000 --- a/data/posts/2018/01/01/review-2017.md +++ /dev/null @@ -1,67 +0,0 @@ ---- -title: Year in Review 2017 -id: review-2017 -created: 2018-01-01 -updated: 2018-01-01 -categories: - - Others -tags: - - Diary -cover: ./cover.png ---- - -# Year in Review 2017 - -## Busy Year - -### Setup Frontend JS UnitTest framework for legacy code - -之前一直想为公司的一个主力项目的前端添加单元测试流程,由于是跟后端项目耦合度相对较高的结构。在基于各大前端项目的单元测试方案 和 项目使用的后端服务前端资源解决方案之上,加上自己编写了一个帮助转换的 karma 测试插件,终于把这个测试流程较为优雅地落地。 - -### More practice in Docker - -今年上半年做过[hyperledger](https://github.com/hyperledger)的 POC。通过在本地模拟一些集群和网络设定,实践了一些 Docker 的基础知识 - -接下来后面对团队内部的一些 CI 和容器化实践中,也稍微实践了一些容器化的其他方向应用。具体应该能写比较多的实践记录,后面会补上。 - -### Became Technical Conferences Committee - -得益于公司年底请来了 Thoughtworks 团队,Thoughtworks 团队的咨询师给公司发展技术社区的氛围提出了很多油可行性高的意见,其中一件事就是举办技术嘉年华一类的活动,以带动、增进公司的技术氛围。第一届的技术嘉年华我是里面 Committees 的其中一位,在负责事件策划,演讲题目 Review,等方面都可以提出自己的意见。 - -## Devices - -今年在消费电子上投入也是挺大的,毕竟没有其他刚需。 - -### RMBP 15 - -![Macbook Pro](./macbook-pro.png) 出了之前的小 RMBP 入手了顶配+定制显卡的 15 寸新 RMBP。 - -感谢猴子聚聚的员工优惠,可惜下单的时候没注意写成了中文键盘。目前用起来一切完美,就是没出 32G 内存有点可惜。 - -### SONY MDR-1000x - -6 月份入手第一款降噪耳机。 ![MDR-1000X](./mdr-1000x.png) - -舒适程度降噪效果比之前的好太多了。目前评价挺高 - -### IKBC DC 108 Wireless - -年度最不值花钱消费之一。 - -一直对 IKBC 这个双模+TypeC 的键盘有个小期望。毕竟市场上做蓝牙机械键盘的不多,而且 Type-C 接口的是第一款能买到的。 - -可惜买到之后发现是失望比较多,先是因为京东的原因给我发错了一款黑轴的,开箱按了一下像吃了屎,立刻联系换货正确版本到了之后,在 Mac 和 Win 蓝牙连接下经常有掉键的现象发生。特别是盲敲密码的时候,甚是恼火。 - -## Summary - -关于去年一年的收获,其实在写这篇的时候还多想了很多个点。不过后面还是可以随时修改内容,也就先放简要的写这么多。 - -过去的一年排期比较满了,但是从技能增长的情况来看,比较接近目标。给自己一个 3.75 先,再接再厉。 - -## Target & Plan - -在新的一年里面,先制定几个非技术向的目标: - -- 多运动,定制运动目标 -- 读点非技术向的书 -- 加强理财技巧 diff --git 
a/data/posts/2018/03/03/latest-update-on-blog-app.md b/data/posts/2018/03/03/latest-update-on-blog-app.md deleted file mode 100755 index dc9cfc2e0..000000000 --- a/data/posts/2018/03/03/latest-update-on-blog-app.md +++ /dev/null @@ -1,86 +0,0 @@ ---- -title: Latest Update on Blog App -id: latest-update-on-blog-app -created: 2018-03-03 -updated: 2018-03-03 -categories: - - Blog -tags: - - Vue - - JavaScript - - Webpack - - Karma - - Gulp - - Github - - Blog -cover: ./cover.png ---- - -# Latest Update on Blog App - -结合最近学到的一些知识,了解的一些规范,和实践过的一些新姿势,重构了 Blog 的整个应用。目前 Vue Branch 版本从`4.0.0-beta` 到了`4.0.1` ,算是可以标记 release 的一个版本了。 - -[项目地址](https://github.com/aquariuslt/blog)不变。(除了最近更新过一次 Github account 的 url,开头从大写变成小写,对其他第三方服务迁移的时候有点麻烦)。 - -## Refactor Background - -其实是很少符合很多最佳实践 - -- 测试流程不完整 -- 代码抽象结构不够好 -- 依赖升级不够及时 -- Markdown 功能化不够完整(其实这次重构花了很多时间都没有做好) -- 之前 Gulp 的部分功能 发现 Webpack 已经有比较成熟的方案可以实现,需要替换 - -## New Features - -目前添加的新功能: - -- 提供了`feed.xml`,支持 RSS 功能 -- 支持 Github Pages SEO -- 支持国际化 -- 同时支持[Travis-CI](https://travis-ci.org/aquariuslt/blog/),[Circle-CI](https://circleci.com/gh/aquariuslt/blog) 构建和发布 -- 将覆盖率报告展现在[coveralls.io](https://coveralls.io/github/aquariuslt/blog)中 - -## Structure/Design/Dependencies Update - -目前代码结构/框架选型/测试流程上的改进包括好几个方面 - -### Config Design - -- 修改了入口配置文件`application.yml`的 schema,结构更加扁平 -- 抽离`google site verification`到配置文件里 - -### Build Flow - -- 将读取`*.md`文件的工具类,从 Gulp Tasks 中抽取出来,目的是将来该工具类可以单独分离成一个模块。目前该模块功能是根据源代码生成合适的 api-schema 的内容,并且在`marked`的功能上做了一个 wrapper,实现自定义 header,自定义代码区块高亮,自定义 id 生成样式等功能 -- 添加生成符合 RSS 规范的相关文件的 Gulp Tasks -- 添加测试相关的 Gulp Tasks -- 添加生成静态子页面的 Gulp Task - -### Testing Flow - -- 目前的测试流程,与语言框架无关的部分,主要是使用`karma` + `webpack` + `mocha` + `sinon` + `chai` +`puppeteer`来构建整体的测试流程。 -- 覆盖率报告方面,主要是使用了`karma`的`spec coverage reporter`插件 来生成较为通用的`lcov.info`报告文件,方便与各大开放的覆盖率报告平台集成。 -- 与语言框架相关的部分,根据 beta 版本的`@vue/test-utils` 的官方推荐单元测试编写方法改写了各大组件的测试前置代码(最新的几个版本坑有点多,大概是与那些在代码里面强行加入了 SSR 的检测之类的改动导致体验挺糟糕的) -- 使用了`moxios`这个`axios`官方的 
mockup lib 来做模拟 http 请求方面的测试 - -### Dependencies/3rd Party Lib Selection - -- PWA 相关配置方面,移除了从利用`sw-preache`构建出 PWA 相关文件的 Gulp,改而采用`offline-plugin` + runtime 模式来切分开发环境与生产环境的的加载。 -- webpack 和相关官方插件升级到 3 的最新版(重构之间 webpack 还发布了 4,但是有点破坏性的改动还没找到合适的替代方案,所以暂时观望和调试中) -- 添加`vue-i18n`做国际化 -- 升级 babel 版本到 6 的最新版,重新配置了 babel 相关的配置文件,统一到较合适的阶段 -- `vue-material`也升级到了 v1.0-beta 版本,为新的 API 修改一轮代码 - -## Benchmark - -目前首次页面加载总共需要 280KB 的流量。 ![blog-resource-transfer-time](./blog-resource-transfer-time.png) - -![benchmark-blog-website](./benchmark-blog-website.png) 重新用 Chrome 的 Audits 工具做了一次测试(中间有根据提示的一些最佳实践准则进行优化) 之后,主要痛点是首次渲染页面时间比较长。 - -## TODO - -- 重新思考 Markdown 转译工具部分代码的选型(当初选型用 marked 真是真是后悔,完全是照搬 perl 实现) -- 补全没写的一些单元测试样例 -- 更新`vexo-cli`提供详细的文档和遵循设计规范的背书说明 diff --git a/data/posts/2018/03/04/karma-based-traditional-java-web-frontend-unittest-solution.md b/data/posts/2018/03/04/karma-based-traditional-java-web-frontend-unittest-solution.md deleted file mode 100755 index a35a6a0a8..000000000 --- a/data/posts/2018/03/04/karma-based-traditional-java-web-frontend-unittest-solution.md +++ /dev/null @@ -1,508 +0,0 @@ ---- -title: 基于Karma的非分离式前端单元测试基础方案 -id: karma-based-traditional-java-web-frontend-unittest-solution -created: 2018-03-04 -updated: 2018-03-04 -categories: - - Blog -tags: - - Java - - JAWR - - Karma - - JavaScript - - Webpack - - ExtJS - - Spring - - JSF - - SpringMVC -cover: ./cover.png ---- - -# 基于 Karma 的非分离式前端单元测试基础方案 - -TL;DR - -## Background - -之前在为公司一个稍微有些年头的核心系统的代码寻找一个合理的单元测试方案,在摆弄了一段时间后,目前奠定了一个基于 Karma 的前端单元测试方案。 - -如果你的项目符合以下条件,那么这个解决方案和其中的思路也许能对你的项目有点帮助。 - -原本的项目与前端相关的部分属于 Java Web 项目,抛开与本次主题无关的部分,具体影响单元测试方案选型和落地的几个因素,我把他归结成几类: - -- 原本的前后端框架选型 -- 在原本的前端代码中,是否具有可见的可测试的单元 -- 在基础的前后端相关框架中,随着时间的变迁,是否有不合理的写法、运用导致在通往可被测试的过程中需要做较大量的代码改动 -- 单元测试方案与流程,是否拥有持续集成的能力 - -### Existing Platform & Technical Selection - -先裸列出项目中与前端相关的,和后端有分离不开的技术点,后面再来看如何一步一步处理这些问题。 - -那么在没有引入前端测试解决方案之前,项目中使用到的技术栈就是: - -- Maven (Java 端构建相关,需要关注其构建过程是否用了影响前端的静态资源文件生成路径) -- JAWR(曾是 Java 
官方社区维护的一个前端资源解决方案) -- SpringMVC (提供了相关的 DispatcherServlet,和一个 JAWR 的 i18n 方案) -- JSF(与前端资源在页面加载方案有关,因为 jawr 提供了一系列的 JSTL/facelets tags) - -为此我根据项目中前后端在测试方面未能被解耦的情况,抽离出一个最小化能体现这几点技术的一个样例项目项目地址: [karma-jawr-sample](https://github.com/aquariuslt/karma-jawr-sample) - -> 为了方便看到项目之前的样子,我给他还没引入单元测试的过程里面打了一个 tag: -> [https://github.com/aquariuslt/karma-jawr-sample/releases/tag/no-frontend-unittest](https://github.com/aquariuslt/karma-jawr-sample/releases/tag/no-frontend-unittest) -> 可以在这里看到当时的一个可运行的一个版本。 - -#### References: - -- [jawr-main-repo](https://github.com/j-a-w-r/jawr-main-repo) 目前的 jawr 官方源代码 repo -- [jawr i18n message generator](https://j-a-w-r.github.io/docs/messages_gen.html) jawr 的前端国际化方案(带与 SpringMVC 集成文档) -- [jawr-quickstart](https://j-a-w-r.github.io/tutorials/quickstart.html) jawr 的官方文档首页 - -> TODO:添加一份相关的前端资源请求解析流程图,正在勾画 ing - -### Existing Frontend Core Framework Test Support - -在稍微了解了项目当前使用到的前端相关选型之后,就要开始思考几个问题: - -1. 在当前项目中使用的框架 本身是具有可模块化,测试化的思想吗? -2. 在 1 确定的前提下,是否会出现,随着时间的发展,为项目贡献代码的过程中出现了错误的使用方法,导致越来越难以测试? -3. 如何可以将目前主流框架的单元测试框架和手段应用到项目中?还是要自己造个轮子? -4. 这个流程设计得具有通用性吗,对于其他使用类似技术栈的项目,是否可以快速应用上去? 
- -#### ExtJS + JAWR Modularize + Component Based Development - -历史的车轮滚滚前进,不同时代的项目技术选型也都都有当时的前瞻性。 - -幸运的是, 项目在这方面的技术选型的时候,当时的前辈应该是考虑到了几点 - -- **jawr** 为源代码模块化提供了基础,正因为源代码能够被模块化,测试的单元至少可以限定在一小部模块(取决于实际使用情况) -- **ExtJS** 无论是项目使用到的 3/4 本身已经是组件化开发思想的一个先祖了,在代码写的最乱的情况我们也能够将一个页面的整个 layout 当成是一个大组件,测试单元从真正的单一最小化组件变成一个少复杂的大组件而已。一旦可被测试,后面的测试思想便能够引导整个开发团队接纳运用这方面的思想。 - -结论: 我们就认为在最原始的项目结构里面,前端部分是可以被单元测试的。 - -#### Wrong Usage Since Long Long Ago - -项目代码的发展啊,当然要框架本身思想牛逼,但是也要考虑到历史的行程,那就是会不会出现各种滥用的情况,导致代码结构絮乱,为了完成需求各种邪门歪道奇技淫巧,而不遵循正确的开发手段。 - -我稍微分析了下项目有哪些反模式的地方,加大了可被测试的难度(这部分后面会用一些过渡类型的手段来补救,但终究不属于合理 CRUD 的做法) - -- 页面之间传值大量通过全局变量做引用(有隐式提升的全局变量,也有刻意为之的全局变量) -- 实际页面在加载的时候,会用到第三方的,页面运行时才加载的,的其他 JS 代码提供释放出的变量/方法(比如版本更新比较快的内部框架) - -还有一些属于并非反模式,但是加大了前后端耦合度的: - -- jawr i18n message generator 会在运行时提供释放一系列全局函数,执行之后才返回当前对应文本的对应语言版本。需要被测试的时候,我们必须有一个不依赖任何后端服务器的 能够根据配置国际化配置文件来模拟 java 版本实现,生成同样的全局函数的手段。 - -### Situations Blocking Writing UnitTest - -Block 住单元测试执行的情况,大部分都是由于业务代码的问题,少部分是 ExtJS 操作 CSS 动画的问题。这部分在设计测试框架及其流程的时候没有先考虑到,需要根据实际情况做调整。 - -> TODO: 后面会持续举例子 - -### Is It Easy to Understand - -假设我提出了一个单元测试的技术选型和对应的流程,那么编写测试代码的时候的开发体验如何,无疑会影响大家后面持续自发编写测试用例的激情。 - -为了提高整个单元测试框架和流程的说服力,我觉得符合以下特点越多 越能够被人接受: - -- 单元测试技术选型必须有主流测试框架作为背书 -- 单元测试框架组合程度相对较高 -- 绝对不能依赖后台运行时服务,可以真正的单独运行 -- 有可持续更新的文档来对应各种应用场景,防止为了测试写测试,或者其他反正确实践手段 - -下面 **Design & Benefits** 这一章,会描述选型背后的一些顾虑和我眼中的亮点。 - -## Design and Benefits - -### Design Background - -我对前端的单元测试的认识,大概是从 2016 年开始,一方面是当时的几大框架 比如`Angular 2`,`React`,`Vue` 有一些比较流行的手脚架,提供了基本的测试框架,和完善的最基础情况的单元测试 example,帮助我在确立目前项目的前端单元测试方案中提供了很多正确的思路) - -(感谢后来的`angular-cli~@angular/cli`,`vue-cli`,`create-react-app`背后的相关的 template 项目,提供了多种测试方案的 example) - -在对比了一些用过的前端单元测试的 Test-Runner 譬如[karma](https://karma-runner.github.io/), [ava](https://github.com/avajs/ava),[jest](https://facebook.github.io/jest/),[jasmine](https://jasmine.github.io/) 之后 - -目前是选用了一套以`karma`为基础的测试方案。中间为了提升编写测试代码的体验,配合`webpack`和一个 karma 插件[karma-jawr](https://www.npmjs.com/package/karma-jawr) - 
-下面这部分,会描述实际用到的测试相关的 lib 及其作用 - -### Test Framework Selection - -可以先看看整个`package.json`里面单元测试相关的 lib [样例](https://github.com/aquariuslt/karma-jawr-sample/blob/master/package.json) - -```json -{ - "name": "karma-jawr-sample", - "version": "1.0.4", - "description": "spring + jawr + extjs sample project with unittest using karma-jawr", - "repository": "https://github.com/aquariuslt/spring-jawr-ext.git", - "author": "Aquariuslt ", - "license": "MIT", - "keywords": ["extjs", "ext3", "spring", "jawr", "jsf", "karma", "mocha"], - "scripts": { - "test": "gulp test" - }, - "devDependencies": { - "@types/chai": "^4.1.2", - "@types/extjs": "^4.2.32", - "@types/lodash": "^4.14.104", - "@types/mocha": "^2.2.48", - "@types/sinon": "^4.3.0", - "ajv": "^6.2.1", - "chai": "^4.1.2", - "coveralls": "^3.0.0", - "css-loader": "^0.28.10", - "file-loader": "^1.1.11", - "gulp": "^3.9.1", - "gulp-sequence": "^1.0.0", - "istanbul": "^0.4.5", - "istanbul-instrumenter-loader": "^3.0.0", - "karma": "^2.0.0", - "karma-chai": "^0.1.0", - "karma-chrome-launcher": "^2.2.0", - "karma-coverage": "^1.1.1", - "karma-coverage-istanbul-reporter": "^1.4.1", - "karma-firefox-launcher": "^1.1.0", - "karma-iframes": "^1.1.1", - "karma-jawr": "^0.1.12", - "karma-junit-reporter": "^1.2.0", - "karma-mocha": "^1.3.0", - "karma-sinon": "^1.0.5", - "karma-sourcemap-loader": "^0.3.7", - "karma-spec-reporter": "^0.0.32", - "karma-webpack": "^2.0.13", - "lodash": "^4.17.5", - "mocha": "^4.1.0", - "mocha-lcov-reporter": "^1.3.0", - "puppeteer": "^1.1.1", - "sinon": "^4.4.2", - "style-loader": "^0.19.0", - "url-loader": "^0.6.2", - "webpack": "^3.11.0" - } -} -``` - -值得提到的相关 lib 是: - -- karma: test-runner 本身 -- karma-chai, karma-chrome-launcher, karma-coverage 等等等以 `karma-` 作为开头的 便是 karma 与其他框架集成的相关框架 -- mocha -- chai 提供测试断言相关 API -- sinon 提供 mock 相关 API -- puppeteer 提供 headless Chrome 的 node.js API 可以在 CI 服务器上方面的提供浏览器环境 -- webpack 及其相关 loader 通过 webpack + 各种 loader 可以方便的引用各种测试家具(fixture), 生成 sourcemap,和根据项目实际情况各种忽略规则。 
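上面说到 mocha 负责组织并执行测试用例。为了直观理解这类 BDD 框架"注册-执行"的两阶段模型,下面用原生 JavaScript 粗略示意 describe/it 的工作方式(仅为示意,并非 mocha 的真实实现,函数名只是借用了 mocha 的风格):

```javascript
// 示意代码(非 mocha 真实实现):BDD 框架分两个阶段工作。
// 阶段一:加载 *.spec.js 文件时,describe/it 只负责"注册"用例;
// 阶段二:框架统一遍历并执行所有已注册的用例,收集每个用例的结果。
var suites = [];
var currentSuite = null;

function describe(name, fn) {
  currentSuite = { name: name, cases: [] };
  suites.push(currentSuite);
  fn(); // 执行回调,让其中的 it(...) 完成注册
}

function it(name, fn) {
  currentSuite.cases.push({ name: name, fn: fn });
}

function run() {
  var results = [];
  suites.forEach(function(suite) {
    suite.cases.forEach(function(testCase) {
      try {
        testCase.fn();
        results.push(suite.name + ' ' + testCase.name + ': SUCCESS');
      } catch (e) {
        results.push(suite.name + ' ' + testCase.name + ': FAILED');
      }
    });
  });
  return results;
}

// 用法示意:注册一个通过、一个失败的用例
describe('ext', function() {
  it('# passing case', function() {});
  it('# failing case', function() { throw new Error('assert failed'); });
});

console.log(run()); // ['ext # passing case: SUCCESS', 'ext # failing case: FAILED']
```

真实的 mocha 还要处理异步用例、before/after 钩子和超时等情况,但"先注册后执行"的核心模型是一致的,这也解释了为什么 karma 只需要把所有 spec 文件加载进浏览器,再统一触发执行即可。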
- -每个 lib 单独使用起来都能够稍作文章,但是最终要的就是这些测试用到的相关 lib,都是可以自由组合的,这也是使用 karma 作为单元测试流程基础的一部分。 - -### Diagram - -具体的测试执行流程 其实都可以通过项目里面的`karma.conf.js`来定义。 - -这里以样例项目代码的`tasks/config/karma.conf.js`来描述一下这个项目在启动测试步骤的时候,经过了些什么。 - -#### karma.conf.js - -```javascript -var webpackTestConfig = require('./webpack.test.config'); -var pathUtil = require('../utils/path.util'); - -var puppeteer = require('puppeteer'); -process.env.CHROMIUM_BIN = puppeteer.executablePath(); - -module.exports = function(config) { - config.set({ - logLevel: config.LOG_DEBUG, - customLaunchers: { - ChromiumHeadlessNoSandbox: { - base: 'ChromiumHeadless', - flags: ['--no-sandbox'] - } - }, - browsers: ['ChromiumHeadlessNoSandbox'], - plugins: [ - 'karma-chrome-launcher', - 'karma-chai', - 'karma-mocha', - 'karma-spec-reporter', - 'karma-coverage', - 'karma-coverage-istanbul-reporter', - 'karma-sourcemap-loader', - 'karma-sinon', - 'karma-webpack', - 'karma-jawr' - ], - frameworks: ['jawr', 'mocha', 'sinon', 'chai'], - files: [pathUtil.resolve('src/test/js/unit/specs') + '/**/*.spec.js'], - reporters: ['spec', 'coverage-istanbul'], - preprocessors: { - '/**/*.spec.js': ['webpack', 'sourcemap'] - }, - jawr: { - configLocation: pathUtil.resolve('src/main/resources/jawr/') + 'jawr.properties', - webappLocation: pathUtil.resolve('src/main/webapp'), - targetLocation: pathUtil.resolve('src/test/js/build'), - localeConfigLocation: pathUtil.resolve('src/main/resources') - }, - webpack: webpackTestConfig, - webpackMiddleware: { - stats: 'errors-only', - noInfo: true - }, - coverageIstanbulReporter: { - dir: pathUtil.resolve('src/test/js/unit') + '/coverage', - reports: ['html', 'lcovonly', 'text-summary'], - fixWebpackSourcePaths: true, - skipFilesWithNoCoverage: true, - thresholds: { - emitWarning: false, - global: { - statements: 1, - lines: 1, - branches: 1, - functions: 1 - } - } - } - }); -}; -``` - -1. 在启动 karma 服务器的时候,读取这个`karma.conf.js`来加载配置。 -2. 
如果在 config 里面没有`plugins` field,则会自动扫描并加载所有 package.json 里面定义的,以`karma-`开头的,符合`karma-plugin` 依赖注入规则的插件。如果有,则只加载`plugins` field 里面定义的插件。 -3. 接着我们定义一个`files`数组,数组里面的每一行都可以使用 unix glob style path patterns 描述我们定义的所有的单元测试文件。 - -这里是`pathUtil.resolve('src/test/js/unit/specs') + '/**/*.spec.js'`,意为扫描的是在`src/test/js/unit/specs`文件夹及其子文件夹下,所有以`.spec.js`为结尾的文件。 - -4. 接着我们定义一个`browsers` field,表示 karma 服务器启动之后,将会根据`browsers`中定义的浏览器名字,通过对应的`karma-${browsers-core}-launcher`提供的 API 来唤起对应的浏览器,在运行时候把上面`files` field 定义的所有测试文件加载到所启动的浏览器的单一 tab 中。 - -5. 浏览器直接加载那些`**/*.spec`类型的单元测试代码就可以了吗?如果用到了一些 CommonJS 语法来编写单元测试,或者你想方便地加载一些测试家具,比如离线加载一些原本在运行时才能被加载的第三方 css,或者想加载以 json/文本形式保存的模拟业务数据返回值(mockup)……等操作 - -那么推荐的做法是在 `preprocessors`里面通过 karma 提供的 preprocessor API,结合第三方 processor 插件,来对单元测试的源代码做一个预处理的过程。 - -这里貌似有点拗口,我们通过加与不加`preprocessors`的时候的一个比较来说明两种情况的区别。 - -`base.spec.js` - -```javascript -require('@/jsBundles/extJs.js'); -require('@/jsBundles/home.js'); - -describe('ext', function() { - before(function() { - Ext.onReady(function() { - Ext.QuickTips.init(); - new agile.example.app.Home({ - renderTo: Ext.getBody() - }); - }); - }); - it('# check extjs is loaded', function() { - var expectExtVersion = '3.3.1'; - expect(Ext.version).to.eq(expectExtVersion); - }); -}); -``` - -在没有`preprocessors`的情况下,浏览器直接把`base.spec.js` 加载到 karma server 启动的浏览器页面中,由于不能识别代码里面的 CommonJS 语法而抛出错误;同样地,因为没有加载到 ExtJS 的源代码文件,也会抛出错误。 - -在`preprocessors`里添加了`karma-webpack`,`karma-sourcemap-loader`,并添加了这些预处理器相关插件的时候,在 karma server 启动的浏览器页面中,由于加载的是被`webpack`解析构建好之后的 bundle 文件,则能够正确地按需加载所有需要加载的 js 文件。 - -在这里`karma-webpack`所提供的配置选项是`webpack`和`webpackMiddleware`两个 option,告诉 karma 对于单个单元测试文件,使用哪个 webpack 配置文件来解析单元测试源代码。 - -有关 karma 相关插件的开发,和这次为了解耦开发的`karma-jawr`插件,将会在另外一篇文章里面详解。 - -6. 那么测试浏览器加载的所有单元测试文件,被当前用到的`mocha`框架解析并执行对应的测试代码,执行之后,我们想知道单元测试的完整覆盖率,那么我们要怎么做呢?
- -那就是`reporters`这一个 field。 - -在`reporters` 里面定义的相关的报告生成器,他们实际上是把对应的测试框架的报告功能统一管理了:执行基于什么类型的单元测试框架,这个单元测试框架如果要统计并展现覆盖率,应该提供哪些配置细节,都在`reporters`里面定义相关的`karma-reporter`插件,并根据该插件要求的配置,来生成对应的覆盖率报告文件。 - -这里选用的是`spec` + `coverage-istanbul`插件,他们将会根据`webpack.test.config`里面配置的 post-loader `istanbul-instrumenter-loader`反向与源代码联系在一起,在执行单元测试的过程中,记录各种方法、变量的调用情况,最后根据`coverageIstanbulReporter`中定义的`reports`类型,生成 html 报告、通用的`lcov.info`覆盖率描述文件,和一个终端输出的报告。 - -7. 那么那个`karma-jawr`起到的是什么作用呢?这是一个为了根据前后端技术选型解耦而自己开发的一个 karma 插件,在下面 **Decoupled Solution** 一章会讲这里的设计 - -### Decoupled Solution - -我们先来看看在本地开发环境下~~(即根据环境分离的相关配置都设置成 env=development,debug=on 之类的参数)~~ - -假设在样例项目中,我们在浏览器里面访问某个 url `xxxx/home`,在经过 SpringMVC 的 viewResolver,mapping 到一个`home.xhtml` 。此时 xhtml 的内容里面,有一些 jawr 相关的 facelets tags,譬如(原文标签在归档时丢失,以下按配置中的 bundle id 还原示意) - -```htmlbars -<jwr:script src="/jsBundles/extJs.js" /> -<jwr:script src="/jsBundles/home.js" /> -<jwr:style src="/cssBundles/ext.css" /> -``` - -这表示他们会根据 jawr 的配置文件`jawr.properties` - -``` -jawr.js.bundle.names=i18n, extJs, home, login -jawr.css.bundle.names=extCss -# JAWR Bundle Definitions -jawr.js.bundle.extJs.id=/jsBundles/extJs.js -jawr.js.bundle.extJs.composite=true -jawr.js.bundle.extJs.child.names=\ - extDebug,\ - extProd -## ExtJS Debug Source -jawr.js.bundle.extDebug.debugonly=true -jawr.js.bundle.extDebug.mappings=/js/vendor/ext/ext-base-debug.js, /js/vendor/ext/ext-all-debug-w-comments.js -## ExtJS Prod Source -jawr.js.bundle.extProd.debugnever=true -jawr.js.bundle.extProd.mappings=/js/vendor/ext/ext-base.js, /js/vendor/ext/ext-all.js -## ExtJS CSS Source -jawr.css.bundle.extCss.id=/cssBundles/ext.css -jawr.css.bundle.extCss.mappings=/css/vendor/ext/ext-all.css -## Home Page Application JS Bundles -jawr.js.bundle.home.id=/jsBundles/home.js -jawr.js.bundle.home.composite=true -jawr.js.bundle.home.child.names=homeStore, homeUi, homeImpl -### Home Store -jawr.js.bundle.homeStore.mappings=/js/home/datastore/** -### Home Ui -jawr.js.bundle.homeUi.mappings=/js/home/ui/** -jawr.js.bundle.homeUi.dependencies=homeStore -### Home Impl -jawr.js.bundle.homeImpl.mappings=/js/home/impl/**
-jawr.js.bundle.homeImpl.dependencies=homeUi - -### Mappings include jawr bundle example -jawr.js.bundle.login.id=/jsBundles/login.js -jawr.js.bundle.login.mappings=homeUi -``` - -查找对应的 mapping,并转换释放为若干个 `<script>` 标签(原文标签在归档时丢失,以下按上面的 mappings 还原示意): - -```html -<script type="text/javascript" src="/js/vendor/ext/ext-base-debug.js"></script> -<script type="text/javascript" src="/js/vendor/ext/ext-all-debug-w-comments.js"></script> -<script type="text/javascript" src="/js/home/datastore/home.base.datastore.js"></script> -<!-- ...其余被 /js/home/ui/**, /js/home/impl/** 等 mapping 匹配到的源文件,逐条展开... --> -``` - -大致的可以用一个流程来解释一下这里的情况,与后面解耦部分密切相关的几点: - -- 我们需要知道一个页面所加载的前端资源,具体的模块配置入口是位于 jawr 配置文件的哪一个 bundle -- 这些页面加载的前端资源,在开发环境模式下,如何逐条转换成对应的 script,css 标签 -- 如果开启了 jawr 对应的国际化功能,我们应该如何在测试中生成这些全局的国际化函数 - -为了解耦,这个`karma-jawr`的中间件提供了这样一个功能: - -根据`jawr.properties`的位置,参考 jawr Java 的路径解释部分实现,生成了一个中间文件夹 ![generated-indexes folder](./karma-jawr-generated-index.png) 配合 webpack 的`alias`功能,我们只要在单元测试代码里面使用类似这样的语法 - -```javascript -require('@/jsBundles/extJs.js'); -``` - -便能够按需加载页面的资源文件执行,原本在 jsf facelet view 里面使用什么 tag,就知道在单元测试文件加载什么依赖。为了解决中间生成的国际化相关的全局函数,也是参考了 jawr Java 端读取 i18n 相关 properties 的实现,写了一个输出结果一致的 i18n 生成模块,并把生成的文件自动添加到每个 index.js 文件列表的最前面,确保它们优先生效,不影响后面 webpack 的解析工作。 - -具体的实现思路,也可以单独作一篇文章,讲解 karma-framework 和 karma-preprocessor 等相关的机制和其作者的一些依赖注入在 node.js 方面的实现。 - -### Benefits - -主要是从开发体验上面来讲,好处如下: - -- 一旦了解 karma 的基本工作机制,便可以自由搭配各种可搭配的测试框架。(比如`mocha`换`jasmine`,`chai`换`expect.js`,`sinon`……额,sinon 目前还没见到可被替换的有效方案) -- 根据项目浏览器的兼容性,可以修改成各种浏览器及其相关启动 flag -- 基于 webpack 的各种 loader 特性,可以很方便地通过 require 语法引入各种测试家具: (json 格式不必额外的 loader,css 则是基于 style-loader 和 css-loader 的各种配合使用,纯文本形式可以搭配 file-loader)。不必自己再写各种工具类轮子来实现持久化模拟数据的读取。 - -一些额外的提升开发体验的糖果: - -1. 首先如果大家的 IDE 支持 webpack 的 alias 快速跳转(比如 IDEA 2017.2 之后的版本),便可以根据引用部分直接跳转到对应的源文件 -2.
配合 IDEA 的 karma 插件,在编写单元测试的时候,可以动态给 karma.Server 注入不同的参数,配合本身`karma-webpack`内置的 webpack-dev-server 可以做到刷新立刻动态构建单元测试,提升单元测试开发效率。 - -## Example - -### Example Usage - -关于食用方法,可以参考上面提供的样例项目代码的地址。 - -大家可以直接根据[travivs-ci.org](https://travis-ci.org/aquariuslt/karma-jawr-sample/jobs/354402343)上的构建记录来看看实际跑的时候经过了什么步骤。 - -``` - ext - ✓ # check extjs is loaded - ✓ # expect home ui is rendered - css - ✓ # should load css from require syntax success - home - ✓ # test home resources load correctly - ✓ # test home ui render correctly - ✓ # test home ui render correctly 2 - i18n - ✓ # check locale message is loaded normally - ✓ # check locale message with arguments loaded normally - special characters in locale message properties - ✓ # check json array value in locale message properties - ✓ # check if boolean value in locale message properties - ✓ # check if string value contains escape characters - ws - ✓ # ext ajax simple mockup -HeadlessChrome 67.0.3372 (Linux 0.0.0): Executed 12 of 12 SUCCESS (0.109 secs / 0.056 secs) -TOTAL: 12 SUCCESS -16 03 2018 16:33:19.396:DEBUG [reporter.coverage-istanbul]: File [/home/travis/build/aquariuslt/karma-jawr-sample/src/main/webapp/js/home/datastore/home.base.datastore.js] ignored, nothing could be mapped -16 03 2018 16:33:19.397:DEBUG [reporter.coverage-istanbul]: Writing coverage reports: [ 'html', 'lcovonly', 'text-summary' ] -=============================== Coverage summary =============================== -Statements : 100% ( 11/11 ) -Branches : 100% ( 0/0 ) -Functions : 100% ( 3/3 ) -Lines : 100% ( 11/11 ) -================================================================================ -16 03 2018 16:33:19.461:DEBUG [karma]: Run complete, exiting. 
-16 03 2018 16:33:19.462:DEBUG [launcher]: Disconnecting all browsers -16 03 2018 16:33:19.474:DEBUG [launcher]: Process ChromiumHeadless exited with code 0 -16 03 2018 16:33:19.474:DEBUG [temp-dir]: Cleaning temp dir /tmp/karma-79051576 -16 03 2018 16:33:19.482:DEBUG [launcher]: Finished all browsers -[16:33:19] Finished 'ext:unittest' after 12 s -[16:33:19] Finished 'test' after 12 s -``` - -## Result - -基于这个方案落地并付诸实践整个测试流程之后,引导大家逐渐开始为该项目编写前端部分的单元测试,并且逐渐可以发展到其他使用到类似技术栈和遇到同样痛点的项目组。 - -目前项目前端业务源代码总量大概在 300K 行 经过三个月的单元测试编写,目前从覆盖率上讲,从 0 达到了 11%左右。 - -我们在为一些特别难以测试的案例里面,根据不同典型的错误场景,也做了不同的对应解决方案,有直接安全重构的,有扩展全局测试用例 timeout 时间的,有非安全重构的。都逐渐提醒整个开发团队在编写新代码或者为旧代码扩展的时候,对代码有着更多的精益思考。虽然前端代码的时间方面,并没有真正做到测试先行的最终目标,但是从不能被单元测试到可被单元测试,代码风格和代码质量都朝着正确的方向走去,少走了很多歪路。 - -## Summary - -这次为项目设计的这个单元测试流程,考验了很多方面的开发与设计能力: - -首先必须了解目前项目使用的后端技术栈,推导出当时选型时候的设计背景,再上熟练运用后端技术栈中的前端资源解决方案。在根据公司项目抽离出不相关的技术栈,搭建一个最小化能复现当时技术栈的相关代码结构,也考验分离项目结构的基本功。 - -其次必须对主流的前端单元测试方案有所了解,使用什么框架,结合什么插件,这些框架哪些部分在进行前后端结构解耦的时候需要考虑,如何快速测试方案是否具有可用性。 - -后面还得了解测试背后如何方便能够展现覆盖率,如何能够通过测试流程自动发现代码中存在的问题,还得有足够的持续集成相关经验。 - -串联起来比较考验综合能力,在落地宣讲的时候,为了寻求背书支撑也做了很多资料搜集的功夫。 - -综合起来就是 考验了小部分项目结构分析能力,前端框架和构建工具选型水平,持续集成选型,和在必要的时候造个中间件的轮子的能力。感觉当时要是哪个方面少了哪一点 可能最后都不能得出一个较为可行的方案。算是一个多面打杂之后的综合输出考验吧。 diff --git a/data/posts/2018/04/01/karma-jawr-development-note.md b/data/posts/2018/04/01/karma-jawr-development-note.md deleted file mode 100755 index 0c50af145..000000000 --- a/data/posts/2018/04/01/karma-jawr-development-note.md +++ /dev/null @@ -1,297 +0,0 @@ ---- -title: 'A Karma Plugin: Karma-JAWR Development Note' -id: karma-jawr-development-note -created: 2018-04-01 -updated: 2018-04-01 -categories: - - Blog -tags: - - Karma - - Node - - AngularJS - - JavaScript - - JAWR -cover: ./cover.png ---- - -# 基于 Karma 的非分离式前端单元测试基础方案 - -## Background - -### Why - -上一篇文章**基于 Karma 的非分离式前端单元测试基础方案**描述了在拆分基于 JAWR 的,前后端的方案的时候,无可避免的为中间编写一个插件的背景故事。 - -## Knowledge Base - -在总结开发这个 karma 插件的笔记的时候,最终目的并不是希望读这篇文章的童鞋了解`jawr`这个插件所解决的核心问题,更多的是介绍 karma 和 karma 
插件的设计理念,稍微对 karma 这个 test-runner 有一个更好的印象;亦或是在前端单元测试框架选型/亦或是根据实际项目需要,为了使得项目可被测试,无可避免的做出比较多的修改的时候,能够遵循这种插件开发的约定,使得项目测试方面更好的走向工程化。 - -### History: node-di, angular.js and karma - -在介绍整个问题之前,无可避免的先介绍一下**karma**的一些背景。 - -如果曾经接触过 angular.js 相关项目的开发,那就一定需要了解一下 angular.js 的依赖注入机制相关知识。 - -angular.js v1 的依赖注入机制及其实现呢,其实就是来自于`node-di`的实现(后来 DEPRECATED 并迁移到`angular/di.js`,虽然后面 angular v2+也并没有使用这个实现)。而`node-di`,`angular.js v1`,和`karma`中的依赖注入实现的主要作者都是同一位大神: [vojtajina](https://github.com/vojtajina) - -所以我们可以看到在根据获取依赖的时候的一些类似的语法,诸如`$inject`等。 - -所以一旦你看过一些其他的 karma 相关的 framework 的源代码,大概就知道要如何起手了去看了,起码你能够从一些基本的 ioc 设计原则上知道 karma 如何加载相关插件,等等。 - -### Karma Plugin Types - -在 karma 的官方文档的[plugins 页面](https://karma-runner.github.io/2.0/dev/plugins.html),提供了 karma 不同类型的插件及其常见列表。(其中很大部分是 karma 团队自己维护的,有一个官方的参考对象)。 - -这里转贴一部分常见的不同几个类型。 - -#### Frameworks - -- karma-jasmine -- karma-mocha -- karma-requirejs - -> karma frameworks 类型比较杂,功能可能是覆盖所有下面多种情况的一种或者多种 - -#### Reporters - -- karma-junit-reporter -- karma-coverage-istanbul-reporter - -> karma reporters 常见的功能是在 karma 运行完测试流程之后,根据测试过程记录下的各种记录文件,生成覆盖率,测试用例列表等报告的功能。 - -#### Launchers - -- karma-chrome-launcher -- karma-firefox-launcher - -> karma launcher 的功能就是提供给你启动所有位于系统中的浏览器的链接功能。比如出场率相当高的 karma-chrome-launcher 就实现了各个系统的 **Chrome**,**Chromium**,**Chrome Dev**,**Headless Chrome(puppeteer)** 的链接启动功能,通过默认的参数/或者自己穿进去的环境变量 等形式 可以唤起对应版本的浏览器实例来运行脚本。 - -#### Preprocessors - -- karma-webpack -- karma-babel-preprocessor - -> Preprocessors 顾名思义就是预处理器。很有可能你的单元测试代码是使用 ES6+的语法进行编写的,可能需要通过 babel 进行转译,或者根据 webpack 的配置 + 不同的 loader 进行转译,才能在运行中的浏览器示例上正常被解析执行。所以在一些 karma config options 里面能够看到类似下面的预处理流程: -> -> ``` -> preprocessors: { -> '/**/*.spec.js': ['webpack', 'sourcemap'] -> }, -> ``` - -## Development Note - -### Concert & Situations - -在编写`karma-jawr`插件之前,我的设想需求,从编写单元测试代码的角度反向推导开之后,是这样一个流程: - -**jawr.properties**(片段) - -``` -# JAWR Bundle Definitions -jawr.js.bundle.extJs.id=/jsBundles/extJs.js -jawr.js.bundle.extJs.composite=true 
-jawr.js.bundle.extJs.child.names=\ - extDebug,\ - extProd -## ExtJS Debug Source -jawr.js.bundle.extDebug.debugonly=true -jawr.js.bundle.extDebug.mappings=/js/vendor/ext/ext-base-debug.js, /js/vendor/ext/ext-all-debug-w-comments.js -## Home Page Application JS Bundles -jawr.js.bundle.home.id=/jsBundles/home.js -jawr.js.bundle.home.composite=true -jawr.js.bundle.home.child.names=homeStore, homeUi, homeImpl -``` - -**xxx.xhtml** - -```htmlbars - - - - Karma Jawr Sample Page - - - - - - - - - - - - -``` - -**xxx.spec.js**(片段) - -``` -require('@/jsBundles/extJs.js'); -require('@/jsBundles/home.js'); - -describe('ext', function() { - it('# check extjs is loaded', function() { - var expectExtVersion = '3.3.1'; - expect(Ext.version).to.eq(expectExtVersion); - }); - - it('# expect home ui is rendered', function() { - expect(Ext.getCmp('app.home')).not.to.eq(undefined); - }); -}); - -``` - -在进行测试的流程里面 - -首先单元测试文件经过 preprocessor 的处理,能够把`require('@/jsBundles/extJs.js')` 正确根据`jawr.properties`的配置内容加载 extjs ~~这里且不说 extjs 本身的代码是否支持 umd 形式的 export~~ 接着在浏览器执行的时候的 html 引入的时候,已经是能够被浏览器正确识别的,转译后的代码。 - -所以从流程上,结合已有的插件,列出了从后到前的顺序点: - -- 编写 BDD 形式的单元测试文件,通过 require/import + jawr bundle id 导入对应的业务代码依赖 -- 经过 webpack 转译成可被浏览器识别的代码 -- 在 karma 启动时的 client html 中通过 mocha 执行所有测试用例 - -### Design - -那么主要的问题就在于,如何使得测试文件中的 `require('@/jsBundles/home.js');` 能够正确根据 jawr 的配置 反向引导对应的源代码呢? - -除此之外,还有一些 jawr+spring 国际化本身的一些实现,如何根据对应的国际化文件,生成那些全局,执行后返回对应语言版本国际化变量呢? 
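在看下面的具体方案之前,可以先用一段示意代码说明第一个问题(根据 bundle id 反查源文件列表)的大致思路。以下仅为示意,并非 karma-jawr 的真实实现,函数名均为假设;真实实现还要处理反斜杠续行、debugonly/debugnever、glob(如 `/js/home/ui/**`)展开和 dependencies 排序等情况:

```javascript
// 示意代码:把 jawr.properties 文本解析成 key-value,
// 再根据 bundle id 反查 bundle 名,并展开 composite bundle 的 mappings。
function parseProperties(text) {
  var props = {};
  text.split('\n').forEach(function(line) {
    line = line.trim();
    if (!line || line.charAt(0) === '#') return;
    var separatorIndex = line.indexOf('=');
    if (separatorIndex > 0) {
      props[line.slice(0, separatorIndex).trim()] = line.slice(separatorIndex + 1).trim();
    }
  });
  return props;
}

function resolveMappings(props, bundleName) {
  // composite bundle:递归合并所有 child bundle 的 mappings
  if (props['jawr.js.bundle.' + bundleName + '.composite'] === 'true') {
    return props['jawr.js.bundle.' + bundleName + '.child.names']
      .split(',')
      .reduce(function(all, child) {
        return all.concat(resolveMappings(props, child.trim()));
      }, []);
  }
  return (props['jawr.js.bundle.' + bundleName + '.mappings'] || '')
    .split(',')
    .map(function(mapping) { return mapping.trim(); })
    .filter(Boolean);
}

function resolveByBundleId(props, bundleId) {
  // 找到 id 等于 bundleId 的 bundle 名,如 /jsBundles/extJs.js -> extJs
  var bundleName = Object.keys(props)
    .filter(function(key) { return /\.id$/.test(key) && props[key] === bundleId; })
    .map(function(key) { return key.replace('jawr.js.bundle.', '').replace('.id', ''); })[0];
  return bundleName ? resolveMappings(props, bundleName) : [];
}

// 用法示意(配置片段为举例)
var props = parseProperties([
  'jawr.js.bundle.extJs.id=/jsBundles/extJs.js',
  'jawr.js.bundle.extJs.composite=true',
  'jawr.js.bundle.extJs.child.names=extDebug, extProd',
  'jawr.js.bundle.extDebug.mappings=/js/vendor/ext/ext-base-debug.js',
  'jawr.js.bundle.extProd.mappings=/js/vendor/ext/ext-base.js, /js/vendor/ext/ext-all.js'
].join('\n'));

console.log(resolveByBundleId(props, '/jsBundles/extJs.js'));
```

拿到这份"bundle id 到源文件列表"的映射,才有后面生成中间 index 文件、配合 webpack alias 按需加载的可能。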
-对于第一步,目前设计的解决方案是如下: - -**第一步:** 给`karma.conf.js` 提供一个额外的 options field: `jawr`,主要是提供一些 jawr 相关配置文件的绝对路径 - -目前我给它设置了一个 type-definition - -```typescript -declare interface JawrOptions { - configLocation: string; - webappLocation: string; - targetLocation: string; - - // optional locale config location for jawr i18n generator - localeConfigLocation?: string; -} -``` - -实际上的使用大概是这样: karma.conf.js - -``` -module.exports = function(config){ - config.set({ - /*....*/ - jawr: { - configLocation: pathUtil.resolve('src/main/resources/jawr/') + 'jawr.properties', - webappLocation: pathUtil.resolve('src/main/webapp'), - targetLocation: pathUtil.resolve('src/test/js/build'), - localeConfigLocation: pathUtil.resolve('src/main/resources') - }, - }) -} -``` - -里面需要知道的是: - -- jawr.properties 的路径 -- webapp 文件夹的路径(目的是为了定位 js,css 业务源代码的路径) -- 生成的中间临时文件夹的路径: 根据 jawr 配置文件生成的、处于`*.spec.js`和源代码中间的临时 link 文件夹 -- 如果启用了可选的国际化模块,则需要填写国际化源代码文件的路径 - -**第二步:** 根据 jawr 的 Java 源代码,使用 js 实现以下功能 - -- 解析 jawr 配置文件,根据每个 bundle id 来查找到对应的源代码文件 -- 解析 i18n 配置文件,生成对应的全局函数 - -**第三步:** 通过 karma 结合 webpack 做预处理器,结合`mocha`,`chai`,`sinon` 做基本的测试。 - -### Development Roadmap - -#### Local Testing - -如果没有了解 npm 加载模块机制和 karma 所使用的 di 约定的时候,可能以为本地测试必须依赖已经发布的 npm package.
- -正确的做法应该是: - -在**karma.conf.js** 的 plugins 显式声明一个本地的引用该引用等同`package.json`里面`main`的指向 - -``` -plugins: [ - 'karma-chrome-launcher', - 'karma-chai', - 'karma-mocha', - 'karma-spec-reporter', - 'karma-coverage', - 'karma-coverage-istanbul-reporter', - 'karma-sourcemap-loader', - 'karma-sinon', - 'karma-webpack', - localJawrFramework // ==> var localJawrFramework = require('../../lib'); - ], -``` - -**package.json** - -```json -{ - "name": "karma-jawr", - "main": "lib/index.js" -} -``` - -**lib/index.js** - -```javascript -var frameworkLogger = require('./logger'); - -var jawrHandler = require('./jawr.handler'); - -/** - * @param {Array} files: file pattern - * @param {JawrOptions} jawrOptions: jawrOptions - * @param {Object} logger: karma logger - * */ -var framework = function(files, jawrOptions, logger) { - frameworkLogger.initLogger(logger); - jawrHandler.handle(jawrOptions); -}; - -framework.$inject = ['config.files', 'config.jawr', 'logger']; -module.exports = { 'framework:jawr': ['factory', framework] }; -``` - -#### Integrate with CI - -目前只有测试部分与`travis-ci`和`circleci`集成了。 - -[circleci](https://circleci.com/gh/aquariuslt/karma-jawr) [travis-ci](https://travis-ci.org/aquariuslt/karma-jawr) - -#### Pre-Release and Testing - -为了解决其他在实际应用中遇到的问题,包括但不限于各种 - -- jawr 配置的胡乱使用 -- node.js 的 properties 解释实现并没有覆盖 properties 事实标准的所有情况 - -等...我是自己维护了 issue 列表并且把每次修改的测试用例都加到本身的单元测试流程中 - -目前详见[issues](https://github.com/aquariuslt/karma-jawr/issues) - -~~有一个目前因为技术原因暂时被我 标记了 wont fix~~ - -## References - -[项目源代码 Repo](https://github.com/aquariuslt/karma-jawr) - -[Karma 作者的设计论文](https://github.com/karma-runner/karma/raw/master/thesis.pdf) - -[Karma 测试框架的前世今生 - 淘宝 TED | Karma 作者论文译文](http://taobaofed.org/blog/2016/01/08/karma-origin/) diff --git a/data/posts/2019/07/07/github-actions-overview-and-practice.md b/data/posts/2019/07/07/github-actions-overview-and-practice.md deleted file mode 100644 index 816848921..000000000 --- 
a/data/posts/2019/07/07/github-actions-overview-and-practice.md +++ /dev/null @@ -1,415 +0,0 @@ ---- -title: 'Github Actions: Overview and Practice' -id: github-actions-overview-and-practice -created: 2019-07-07 -updated: 2019-07-07 -categories: - - Blog -tags: - - NPM - - CI - - Github - - TravisCI - - Actions - - Docker -cover: ./github-actions.png ---- - -# Github Actions: Overview and Practice - -## Background - -Github Actions 自从开放 beta 以来,感觉一直没有掀起什么大浪。但是他的在 CI 平台插件方面的概念其实是符合一直以来的方向的: `Docker Image as Plugins` - -我在二月份开始已经申请到了 Github Actions Beta 的体验许可,为了体验 Github Actions 的功能,以及跟目前其他开源项目所用的持续集成平台进行简单的对比,下面将以一个 通过 Gtihub Action 发布 npm package 的过程作为初步体验,讲讲我对 Github Actions 在 CI 方面的认识。 - -## Before Reading - -再开始阅读之前,准备了一些截图,简要的从表现上了解下 Github Actions 的一些功能 - -- What is Github Actions -- How workflow configuration look like -- How workflow runtime look like - -### What is Github Actions - -Github 官方介绍,简要的将就是 Github 官方提供的 CI 平台,其最大的特点便是插件化形式来进行 CI Stages 的组装。 - -> GitHub Actions allow you to implement custom logic without having to create an app to perform the task you need. You can combine GitHub Actions to create workflows using an action defined in your repository, a public repository on GitHub, or a published Docker container image. GitHub Actions are customizable and can use the GitHub API and any publicly available third-party APIs to interact with a repository. For example, an action can publish npm modules, send SMS alerts when urgent issues are created, or deploy production ready code. You can discover, create, and share your GitHub Actions with the GitHub community. - -先是官方带头在平台基础上创建了一些通用性比较高的、对标其他同类平台的一些主打功能,以一个封装好的 `Actions` 概念暴露出来,然后每个 Repository 便可使用这些官方预设的 `Actions` 来组装自己的 `Workflow` 。 - -官方 Actions 集合 [https://github.com/actions](https://github.com/actions) 里找到官方维护的 actions. 
常见的一些 actions 有: - -- npm -- docker -- shell - -![](./github-official-actions-repo.png) - -当然也有很多非官方维护的 Actions,可以通过发布到 Github Marketplace 被查找到,也可以把 Actions 构建成 `docker image` + `entrypoints.sh` 的形式, 在项目 Workflow 的配置文件中以 docker image 路径进行引用。 - -### How configuration look like - -如果有使用过 Travis-CI, CircleCI, Gitlab-CI 的经验,大概可以想起,对应的 CI 配置文件,大都通过 yml 作为基础语法,来描述整个 CI Pipeline 涉及的几个步骤。 - -在 Github Web 界面中,对一个 Repository 的 Workflow 进行可视化的编辑时,如下图: - -![](./workflow-editor-example.png) - -实际上他背后也是一个配置文件,用这个配置文件来描述 workflow 的触发条件,步骤,以及每个步骤执行时的参数... 只不过 Github 官方提供了基于这个配置文件的在线可视化编辑,使得编辑过程比较容易直观。 - -关于配置文件的语法、编写形式,下面会以样例提及。 - -### How workflow runtime look like - -在 Github 界面上查看单个项目的 Workflow 下图这样子: - -![](./workflow-example.png) - -这个 Workflow 比较直观的表示了一个基本的 `构建-发布` 流程: - -1. 在 Github 收到 `push` 事件的时候,触发该 workflow -2. 首先同时执行 `dockerlint` 和 `shelllint` 步骤 -3. 以上两步都通过后,根据项目的 dockerfile 构建 docker 镜像 -4. 对 docker 镜像进行多次打 Tag (latest, timestamp),同时判断是否需要进行发布(判断当前分支是否 master 分支) -5. 使用 secrets 进行 docker login 步骤 -6. 
调用 docker publish 对刚才打过 tag 的镜像,发布到官方的 docker registry 中去 - -这样一看,对 Github Actions 和 Workflow 的界面表现形式有了一个模糊的了解。基于其他 CI 平台的使用经验,大概可以想象自己目前的 CI Pipeline 转移到 Github Actions 上,表现出来会是怎样一个形式。 - -## Example Project - -现在介绍用来做样例的项目。 - -这是一个 npm package,已经发布到 npm 上。 - -npm 地址: [https://www.npmjs.com/package/jest-properties-loader](https://www.npmjs.com/package/jest-properties-loader) - -github 地址: [https://github.com/aquariuslt/jest-properties-loader](https://github.com/aquariuslt/jest-properties-loader) - -现在我们来了解下: - -- 项目目录结构 -- 本地发布前、发布时,一般要做什么步骤 - -接着在下一章介绍,如果通过 Travis-CI 或者 CircleCI 进行持续构建和自动触发发布,一般的思路和实现方式。 - -最后再简介如何通过 Github Actions 实现同样的功能。 - -### Project Overview - -这个 npm package 的作用,主要是为 jest 提供一个 `.properties` 文件的 json 格式转换。 - -不过其实我们不需要了解他的作用,仅仅把他当做一个普通的 public npm package 吧。 - -项目的文件结构如下: - -``` -├── LICENSE -├── README.md -├── lib -│   └── index.js -├── package.json -├── test -│   ├── __fixtures__ -│   │   └── sample.properties -│   └── loader.test.js -└── yarn.lock -``` - -### Manual Release Flow - -**Running Tests** - -当我们准备发布一个新版本时,先确保通过测试 - -```bash -yarn test -``` - -**Build Distribution (Optional)** - -如果项目需要构建,则需要有类似 `build` 的一步。 - -不过这个项目直接以可运行的 commonjs 语法编写,所以省去了这一步。 - -**Update Version** - -发布前,我们需要更新仓库版本号,遵循 `semantic versioning` 的原则,进行代码版本号更新。 - -```bash -yarn version -``` - -**Login NPM** - -现在一切就绪,我们准备进行 npm package 发布。 - -一种方式是通过 `npm login` 命令进行登录,然后输入 npm 用户名和密码 - -```bash -npm login -``` - -当然也有另外一种 npm 推荐的方式,那便是使用 NPM 的 [Auth Token](https://docs.npmjs.com/about-authentication-tokens) 进行验证。 - -参考链接中的步骤创建好一个 Auth Token 后,将其写入 `.npmrc` 文件内 - -```bash -export NPM_TOKEN="00000000-0000-0000-0000-000000000000" -echo "//registry.npmjs.org/:_authToken=${NPM_TOKEN}" >> .npmrc -``` - -参考文章: [https://blog.npmjs.org/post/118393368555/deploying-with-npm-private-modules](https://blog.npmjs.org/post/118393368555/deploying-with-npm-private-modules) - -**NPM Publish** - -```bash -npm publish -``` - -至此,我们便手动地将一个 npm package 的新版本发布到 npm registry 了。 - 
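上面"更新版本号"这一步,本质上就是按 semantic versioning 规则修改 `package.json` 中的 `version` 字段。可以用一小段示意代码理解这个递增规则(仅为示意;真实的 `yarn version` 除了改写 `package.json`,还会生成对应的 git commit 和 tag,如 `v1.0.4`):

```javascript
// 示意代码:semantic versioning 下 major.minor.patch 的递增规则
function bumpVersion(version, level) {
  var parts = version.split('.').map(Number); // [major, minor, patch]
  if (level === 'major') return (parts[0] + 1) + '.0.0';
  if (level === 'minor') return parts[0] + '.' + (parts[1] + 1) + '.0';
  return parts[0] + '.' + parts[1] + '.' + (parts[2] + 1); // 默认按 patch 递增
}

console.log(bumpVersion('1.0.3', 'patch')); // 1.0.4
console.log(bumpVersion('1.0.3', 'minor')); // 1.1.0
console.log(bumpVersion('1.0.3', 'major')); // 2.0.0
```

后面自动发布时,就是以这个版本号对应的 tag 作为触发发布流程的信号。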
-## Existing CI CD Flow - -现在我们看看如果是 Travis CI 进行自动构建 npm 包发布是怎么做呢 - -Travis CI 的 Example 样例如下: - -```yaml -language: node_js -node_js: - - '10' - -script: npm test - -# Submit Coverage Status to coveralls.io -after_script: - - cat ./test/coverage/lcov.info | ./node_modules/.bin/coveralls - -deploy: - provider: npm - email: $NPM_AUTH_EMAIL - api_key: $NPM_AUTH_TOKEN - on: - tags: true - branch: release -``` - -具体点讲,这份配置文件关于自动发布 npm package 的过程便是 - -1. 触发条件: 当触发 travis-ci 的 webhooks 是 `tags` 而且 分支名为 `release` 时,触发此 deploy 操作。 -2. 使用 travis-ci 提供的 [https://docs.travis-ci.com/user/deployment/npm/](https://docs.travis-ci.com/user/deployment/npm/) `npm release provider` 配合 github secret `$NPM_AUTH_TOKEN` 进行发布。( token 为上一部分介绍的 npm token) - -配置过程相当简洁,唯一一点是 npm publish 过程并不透明,我们无法知道 travis ci 的具体实现。 - -## Introducing Github Actions Solution - -现在开始介绍我如何使用 Github Actions 构建一个 workflow,来进行 npm package 的发布。 - -下面介绍理想中的思路,然后拆分开来讲讲我是如何实现的。 - -### Release flow design based on Github Actions - -在理想的设计里,基本遵循两部分操作: - -1. 每次提交代码 (`push event`),触发基本的 build & test pipeline -2. 每次准备发布新版本的时候,通过 `yarn version` 给即将发布的 npm package 版本迭代版本号,此操作不仅会更新 `package.json` 中的版本,还会给这个更新版本的 commit 自动打上一个 tag -3. 
需要发布的时候,在 Repository 的 Github 主页中,从新版本中的 tag 创建一个新的 release。这个 `release event` 会触发 release pipeline,执行发布新版本的操作。 - -### Create a workflow file - -按照 Github Actions 的约定,我们需要在 Repository 根目录的 `.github` 目录下,创建一个 `main.workflow` 文件。Github 将会以 `.github/main.workflow` 作为 workflow 的入口配置文件。 - -`.workflow` 文件是 Github Workflow 的文件名后缀,其语法为 `hcl` 。 - -> 2019 年 8 月初, github actions 官方宣布 actions 配置语法会从 HCL 改变为 yaml, 因此这篇文章的 HCL 部分其实在 9 月 30 日之后便失效。后面会更新一篇文章,简述新的 yaml 语法 See [Migrating Github Actions from HCL syntax to YAML syntax](https://help.github.com/en/articles/migrating-github-actions-from-hcl-syntax-to-yaml-syntax) - -hcl 全称 **HashiCorp Configuration Language**,其具体的语法规则比较容易明白,实际上可以快速的以 `json`, `yml` 的概念去理解他。 - -Tips: 如果平时使用 IntelliJ IDEA 进行开发,那么可以通过给 `.workflow` 添加语法支持,将其识别为 `HCL` 语法,即可获得高亮与格式化 - -![](./workflow-syntax-highlight.png) - -为了实现基本操作中的第一部分: 每次提交代码时,触发基本的 build & test pipeline - -我在 `main.workflow` 中声明 `build & test` 的 pipeline。 - -这里为了方便,我使用了 `workflow/ci` 作为他的简称。 - -```hcl -workflow "workflow/ci" { - on = "push" - resolves = ["test"] -} - -action "install" { - uses = "actions/npm@master" - runs = "yarn" -} - -action "test" { - uses = "actions/npm@master" - needs = ["install"] - runs = "yarn" - args = "test" -} -``` - -以上代码块,声明了一个 `workflow` 和两个 `actions` 。 - -以 workflow 作为入口,actions 里的 `resolves` 和 `needs` 作为依赖指向,可以生成有向无环图来描述这个构建流程。 - -其实际可以理解为: - -1. 当触发了 github repository 的 push event 的时候,执行该名为 `workflow/ci` 的 workflow -2. `workflow/ci` 实际上只要执行名为 `test` 的 action,所以会从整个配置文件里,查找 action = `test` 的 action 并执行。 -3. 当准备执行 `test` actions 之前,检查他的 `needs` 部分,哦原来执行 `test` 之前,还要先执行一个 `install` 的 action。那么现在需要按顺序执行 `install` `test` 这两个 actions - -我们开始看 `install`. - -`install` 的内容,便是执行 `yarn` 这个命令,来安装依赖,那么这个 `uses` 何解? 
- -`uses` 实际指向了一个 github actions 的名字,现在名为 `actions/npm@master` ,那么他会从 `[https://github.com/actions/npm](https://github.com/actions/npm)` 的 master 分支下,查找 `Dockerfile` ,构建出一个 docker 镜像,并且使用该 docker 镜像,执行 `runs` 和 `args` 下的命令操作,对工作目录(workspace)进行更变。 - -说白了,这就是在一个有 `npm` 环境下执行 `yarn` 的一个操作,来安装依赖。 - -至于如何理解 Github Actions 本身的基础运作,可能需要单独开启一篇进行讲解。 - -现在再看 `test` - -说白了,就是在已安装了 node 依赖的工作目录下,执行 `yarn test` 命令 - -所以每当触发 Github 的 push event 之后,我们可以在 Github Actions 流程里看到 `workflow/ci` 的执行。 - -![](./workflow-ci-overview.png) - -### Add Publish workflow - -现在我们来添加一个名为 `release` 的 workflow。在 `main.workflow` 下追加如下内容: - -```hcl -workflow "release" { - on = "release" - resolves = ["npm:release"] -} - -action "filter:release" { - uses = "actions/bin/filter@master" - args = "action created*" -} - - -action "npm:release" { - needs = "filter:release" - uses = "actions/npm@master" - secrets = ["NPM_AUTH_TOKEN"] - args = "publish" -} -``` - -里面的内容大概如下: - -1. 当收到 github 的 release event 时,触发此操作。(通常是在 Repository 主页的 Release 一栏,创建或更新 新版本 release 时触发的 event) -2. 在执行真正的 npm publish 操作之前,先使用一个名为 `filter: release` 的 action,判断是否真正的需要执行 publish。 -3. 
我们使用 github 官方提供的 `actions/npm` 的 npm 环境,传入一个 `NPM_AUTH_TOKEN` 的环境变量,然后执行 `npm publish` 命令进行发布。 - -现在,我们在 github release 页面创建一个新 release 时 - -![](./create-release.png) - -将会触发 `npm:release` workflow: - -![](./trigger-release-workflow.png) - -接着可以看到如下 detail log, 详见 [https://github.com/aquariuslt/jest-properties-loader/runs/163050939](https://github.com/aquariuslt/jest-properties-loader/runs/163050939) - -``` -Successfully built fbc21be41b97 -Successfully tagged gcr.io/gct-12-lnx0cl9uvvzo-abntpgke1w/484654f7adf6911ca799c01d40b80af4456b46c3418f33934c643ccc7f245f38/8a5edab282632443219e051e4ade2d1d5bbc671c781051bf1437897cbdfea0f1:fc613b4dfd6736a7bd268c8a0e74ed0d1c04a959f59dd74ef2874983fd443fc9 -Already have image (with digest): gcr.io/github-actions-images/action-runner:latest -npm notice -npm notice package: jest-properties-loader@1.0.3 -npm notice === Tarball Contents === -npm notice 764B package.json -npm notice 1.1kB LICENSE -npm notice 325B README.md -npm notice 531B .github/main.workflow -npm notice 200B lib/index.js -npm notice 15B test/__fixtures__/sample.properties -npm notice 547B test/loader.test.js -npm notice === Tarball Details === -npm notice name: jest-properties-loader -npm notice version: 1.0.3 -npm notice package size: 2.0 kB -npm notice unpacked size: 3.4 kB -npm notice shasum: dc929c7b12c0bf8e57f100f8cfb95be26fe9464e -npm notice integrity: sha512-dhmq60thri/8x[...]HG4ITr5zZ8fIw== -npm notice total files: 7 -+ jest-properties-loader@1.0.3 -npm notice - -### SUCCEEDED npm:release 14:18:26Z (16.054s) -``` - -至此,新版本的 `jest-properties-loader` 已发布到 npm 中。 - -**为什么第二步中,我需要做这样一个 filter 呢?** - -答: 因为按照 Github 的 event 列表,在 web 页面上创建一个新 release,会同时触发两个 release 事件: `release: created` 和 `release: published` 。当这两个 release 事件都触发这个 workflow 的时候,一旦其中一个 `npm publish` 命令成功发布新版本的包之后, 另外一个在执行发布操作时,就会返回一个 **版本已存在** 的错误,导致整个 commit status 是失败的。 - -所以我通过 github 官方的 `bin/filter` action 来进行触发事件的过滤,保证同一次界面上的 `release` 操作只会执行一次 `npm publish` .
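上面 `actions/bin/filter` 里 `args = "action created*"` 的判断,可以用一段示意代码来理解(仅为示意,并非该 action 的真实实现;函数名为假设):

```javascript
// 示意代码:模拟 filter action 对 event payload 中 action 字段的通配符匹配。
// 只有匹配成功,后续 needs 它的 action(这里是 npm:release)才会继续执行。
function escapeRegExp(text) {
  return text.replace(/[.+?^${}()|[\]\\]/g, '\\$&');
}

function matchAction(eventAction, pattern) {
  // 把 glob 风格的 pattern(如 created*)转成正则做全量匹配
  var regex = new RegExp('^' + pattern.split('*').map(escapeRegExp).join('.*') + '$');
  return regex.test(eventAction);
}

console.log(matchAction('created', 'created*'));   // true:继续执行 npm publish
console.log(matchAction('published', 'created*')); // false:本次 workflow 到此为止
```

也就是说,同一次界面操作虽然触发了 `created` 和 `published` 两个事件,但只有前者能通过这个过滤条件。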
-**为什么在使用 `actions/npm` 时,传递一个 $NPM_AUTH_TOKEN secret 变量即可进行发布?** - -看上去跟之前提及的 "创建好一个 Auth Token 后,将其写入 `.npmrc` 文件内" 似乎不太一样? - -这是因为 `actions/npm` 的 entrypoint.sh 文件,就是做了这样一步环境变量的判断。约定为 `$NPM_AUTH_TOKEN` 时,便将其写入到 `.npmrc` 文件中。 - -entrypoint.sh 文件大致如下,详见 [https://github.com/actions/npm/blob/master/entrypoint.sh](https://github.com/actions/npm/blob/master/entrypoint.sh) - -```bash -#!/bin/sh - -set -e - -if [ -n "$NPM_AUTH_TOKEN" ]; then - # Respect NPM_CONFIG_USERCONFIG if it is provided, default to $HOME/.npmrc - NPM_CONFIG_USERCONFIG="${NPM_CONFIG_USERCONFIG-"$HOME/.npmrc"}" - NPM_REGISTRY_URL="${NPM_REGISTRY_URL-registry.npmjs.org}" - NPM_STRICT_SSL="${NPM_STRICT_SSL-true}" - NPM_REGISTRY_SCHEME="https" - if ! $NPM_STRICT_SSL - then - NPM_REGISTRY_SCHEME="http" - fi - - # Allow registry.npmjs.org to be overridden with an environment variable - printf "//%s/:_authToken=%s\\nregistry=%s\\nstrict-ssl=%s" "$NPM_REGISTRY_URL" "$NPM_AUTH_TOKEN" "${NPM_REGISTRY_SCHEME}://$NPM_REGISTRY_URL" "${NPM_STRICT_SSL}" > "$NPM_CONFIG_USERCONFIG" - - chmod 0600 "$NPM_CONFIG_USERCONFIG" -fi - -sh -c "npm $*" -``` - -## Finally - -最后总结,通过本次 Github Actions 的实践,大致了解了: - -- Github Actions 的基本概念 -- Workflow 的基本组成,配置概念及其运行表象 -- Travis CI Pipeline 对应的 Github Actions -- 如何通过 Github Actions 自动发布 NPM Package - -## References - -- Github Actions Official Document - [https://developer.github.com/actions/](https://developer.github.com/actions/) -- Github Actions Marketplace - [https://github.com/marketplace](https://github.com/marketplace) -- Creating a workflow with github actions - [https://help.github.com/en/articles/creating-a-workflow-with-github-actions](https://help.github.com/en/articles/creating-a-workflow-with-github-actions) - -## Further Reading - -- Automate your NPM publish with GitHub Actions - 
[https://medium.com/faun/automate-your-npm-publish-with-github-actions-dfe8059645dd](https://medium.com/faun/automate-your-npm-publish-with-github-actions-dfe8059645dd) -- HCL (HashiCorp Configuration Language) - [https://github.com/hashicorp/hcl](https://github.com/hashicorp/hcl) diff --git a/data/posts/2019/07/23/travel-in-beijing.md b/data/posts/2019/07/23/travel-in-beijing.md deleted file mode 100644 index f06d3f4d8..000000000 --- a/data/posts/2019/07/23/travel-in-beijing.md +++ /dev/null @@ -1,178 +0,0 @@ ---- -title: 'Travel in Beijing' -id: travel-in-beijing -created: 2019-07-23 -updated: 2019-07-23 -categories: - - Blog -tags: - - Trips -cover: ./palace-museum-01.png ---- - -# Travel in Beijing - -毕业第四年后,终于有机会第一次旅行,目的地选择了北京。 - -本次北京行采用的是语文课本童年回忆随缘踩点法,在打卡景点的同时,顺便会一下闻名已久的北京帮朋友。 - -算上两天年假,一共有四天假期,周四晚上飞机,周一晚上回深圳。 - -于是按照时间线来一发流水账,在总结下本次北京行的直观感受。 - -## Day0: 凌晨四点中关村之旅 - -周四晚上 12 点半从机场出来,直奔北京分公司放下行李,随后便开始了 MSRA 参观之旅。在马老师的带领下,深夜潜入微软北京分部。 - -![](./msra.png) - -## Day1: 国家奥体中心 - -周五因为见识到凌晨四点的北京,所以八点才开始休息,醒来时已经是下午三点。 - -由于晚上约了北京帮在中关村便宜坊聚餐,于是马上打车到附近的国家奥体中心。 - -奥体中心主要是鸟巢、水立方和附近的一些大楼。 - -![](./bird-nest.png) - -偶遇杰森斯坦森 x 吴亦凡演唱会 - -![](./bird-nest-show.png) - -没有等待到晚上水立方点亮的特效,在白天看上去比较平淡 - -![](./water-m3.png) - -交通指数: ⭐️⭐️⭐️⭐️⭐️ - -印象指数: ⭐️⭐️⭐️ - -## Day2: 故宫博物馆线 - -> 在北京的中心,有一座城中之城,这就是紫禁城。现在人们叫它故宫,也叫故宫博物院。这是明清两代的皇宫,是我国现存的最大最完整的古代宫殿建筑群,有五百多年历史了。紫禁城的城墙十米多高,有四座城门:南面午门,北面神武门,东西面东华门、西华门。宫城呈长方形,占地 72 万平方米,有大小宫殿七十多座、房屋九千多间。城墙外是五十多米宽的护城河。城墙的四角上,各有一座玲珑奇巧的角楼。故宫建筑群规模宏大壮丽,建筑精美,布局统一,集中体现了我国古代建筑艺术的独特风格。 - -我基本已经忘记语文课本上曾经关于故宫博物馆的描述了,满脑子都是 TVB 古装剧的情景。 - -周六起床马上直达故宫博物馆。 - -由于打车不能直接到达天安门正门,所以我是从西南门进入故宫,沿着护城河走了一公里左右,便来到了午门~~斩首~~。 - -![](./palace-museum-01.png) - -从午门开始检票进入故宫博物馆范围,便开始了随缘遍历游览。 - -![](./palace-museum-02.png) - -御花园。 - -![](./palace-museum-03.png) - -![](./palace-museum-04.png) - -遍历游览到最北的神武门,可以辗转上城墙上走回东门。这张照片是在城墙上纵览角度拍照的。 - -![](./palace-museum-05.png) - -另值得一提的是,中间有一件紫禁城历史 VR 体验室,50RMB 体验 8 分钟紫禁城建造历史纪录片。 - -是头戴式 VR+动感电椅,纪录片大部分时间以明朝皇帝视角讲述故宫建造历史。配合电椅的抖动和故事片的剧情,仿佛有一种 
~~重生之我是朱棣~~ 的感觉。值得体验。 - -从神武门上城墙,然后从东门离开故宫,继续往南边走,就是国家博物馆。又经过一轮无尽的安检排队之后,进入国家博物馆。 - -国家博物馆分很多层,其中每层的不同位置在一年的不同周期开设不同的主题展,我随便选了几个主题展逛逛: - -一层主题大厅是丹心铸魂 - 雕塑艺术展其他展厅则是古代钱币、佛造像、亚洲文明、非洲古雕像展。 - -值得一说的是,本期从西安借来了一批兵马俑 ~~直接省了西安行?~~ - -交通指数: ⭐️⭐️⭐️ - -印象指数: ⭐️⭐️⭐️⭐️⭐️ - -## Day3: 烈日攀爬长城 - -![](./great-wall-01.png) - -由于周六故宫线走得是在累,原定早上提前出门乘坐轨道交通前往八达岭长城,结果惯性起晚后,再打车到黄土店火车站时已经错过了十分钟前一班,在车站浪费了进一个小时。 - -经过两小时折腾后,在八达岭火车站下了车,还需要走个一两公里,才能到达前往登城入口的接驳车上车点。 - -接下来便是攀城墙的艰苦时光: 在暑期出行攀登长城,最大的问题可能并不是烈日当空,而是众多的游客互相拥挤,使得长城一路上寸步难行。路上基本是在人海中挪动,偶尔被撑开的太阳伞戳中,偶尔被踩中,反正这点体验极差。 - -![](./great-wall-02.png) - -之前在火车上听到导游说八达岭长城分成南北两部分,其中北边有 8 个烽火台,第 7-8 个烽火台相当陡峭,登上之后便是好汉坡。 (而且有缆车直接下去,我马上毫不犹豫选择了北长城)。 - -在人海中边蠕动边拍照,终于来到好汉坡,赶紧打卡拍了 N 张全景,其中自我感觉有几张特别有 feel,选取了其中一张呈上。 - -![](./great-wall-03.png) - -于是我便在七月北京最热的天里登上了八达岭长城第八层,不知道是不是因为太累,我终于拍到了记忆中课本上长城的样子。 - -交通指数: ⭐️ - -印象指数: ⭐️⭐️⭐️⭐️ - -## Day4: 清北潜入作战 - -周一由于是晚上的航班,按照原定计划是午饭后清华北大潜入大作战,顺便到圆明园遗址一游,然后回到中关村公司北京分部取回行李,前往机场回深圳。 - -但是又一次错误的估计了地图上的距离。 - -从中关村银科大厦附近出发,前往北京大学西门(就是历史最悠久的西门),出来时已经接近 3 点半。接着发现清华周一不开放参观,贵校观光就此错过。 - -![](./pku-untitled-lake.png) - -上图拍摄于北大未名湖 - -后来从圆明园南门购票进入,发现步行到圆明园遗址(西北区域)相当久,于是圆明园最值得去的两个区域(西北区的遗址,东南区的清华)与我离奇的错过 - -交通指数: ⭐️ 一星给随缘出行导致没有打成核心景点打卡的我 - -印象指数: ⭐️⭐️⭐️ - -## 北京印象 - -北京之旅没有衣、只有食住行印象。 - -### 食 - -可能还没尝过很多纯正的北京菜式,但是北京烤鸭倒是已经尝到滋味。 - -总体的说印象就是不辣,但还没听到有强烈推荐不吃后悔的东西。 - -### 行 - -作为首都,给我的印象居然是路面不太堵,在北京的几天中每天打车都没有遇到堵车的情况。也许是没有乘坐过传说中的北京一号线地铁,没体验过地铁口被排队人龙吓哭的情况,这点非常惊讶。 - -北京的景点安检非常多,基本每个地方都会在安检上消耗不少时光,传说中的便衣比行人多确实是有可能的。 - -短短几天的打车出行里,北京的滴滴司机给我印象特别好,讲话特别客气。 - -### 住 - -北京行之前定好机票的时候,在中关村附近已经没有低价的旅店,幸好与博哥、谢总同住的朋友周末不在北京回老家,得以省了上千酒店费用,顺带见识到北京租房的惨淡情况。 - -就短短几天看来,京城的建筑印象,大都是那种古装影城与现代建筑交错的风格。在一条街道上,可以同时看到几个不同时期的建筑风格: - -- 明清古城威严的紫禁城宫廷风格 -- 近现代民国时期淳朴的四合院 -- 九十年代企业职工安置房 -- 现代化的国际都市高楼大厦 - -给我印象最深刻的是,北京为了保留或者迎合这种极具东方特色的建筑文化,大量的采用了宫廷红的配色,并且很多这些古色古香的建筑中,在搭配电子科技譬如摄像头,灯饰等电子化的元素时,都很有默契的给他们安排上伪装色调,仿佛看上去更好的与这种宫廷风结合更加紧密。 - -北京的地区名字很有京味儿。打开地图周围总是什么里,什么屯,什么旗,什么坟,什么观。 - 
-我借居的地方是海淀区的一个老校区,叫做**什么什么里**。这小区大概年龄跟我老家房子差不多,九十年代的最高七层楼梯房,但说七层也不是准确的七层,因为我第一次知道北京的老小区,有大量的地下室。不是那种负一层的停车场,而是想象中的暗无天日,通风极差的负一层,这个楼层一般房租最便宜,是初来京城谋生,囊中羞涩的兜底之选。 - -在这种**什么什么里**的小区里,随处可以听到充满京味儿的地道的北京话。 - -> 好咧,得嘞,马上就给您送来。 - -## Impression - -最后总结下北京之旅的印象 - -当之无愧的文化中心。 diff --git a/data/posts/2019/08/18/github-actions-new-yaml-syntax.md b/data/posts/2019/08/18/github-actions-new-yaml-syntax.md deleted file mode 100644 index 95e464c7c..000000000 --- a/data/posts/2019/08/18/github-actions-new-yaml-syntax.md +++ /dev/null @@ -1,185 +0,0 @@ ---- -title: 'Github Actions: New YAML Syntax' -id: github-actions-new-yaml-syntax -created: 2019-08-18 -updated: 2019-08-18 -categories: - - Blog -tags: - - NPM - - CI - - Github - - TravisCI - - Actions - - Docker -cover: ./migrating-github-actions.png ---- - -# Github Actions: New YAML Syntax - -## Background - -在之前一篇博文里刚介绍完 Github Actions 配置的`HCL`语法不久,Github 官方就标记为 **deprecated** 了。原因是社区声音推崇他们使用新的 YAML 语法,这类的语法配置与现有的其他 CI 平台相对更加接近,更加容易举一反三写出合理的配置。 - -> The documentation at https://developer.github.com/actions and support for the HCL syntax in GitHub Actions will be deprecated on September 30, 2019. Documentation for the new limited public beta using the YAML syntax is available on https://help.github.com. See "Automating your workflow with GitHub Actions" for documentation using the YAML syntax. 
-
-自此之后,官方的 Github Actions Marketplace 也多了更多的官方 Actions (actions 官方 org 原本置顶的几个 actions 源代码也都换成了常见的项目集成 samples)
-
-话不多说,就着本次 Github Actions Syntax Migration,还是以上次的 `jest-properties-loader` 为例子,我将它用 typescript 进行重写,并且把 workflow 以同样的思路,迁移到 `YAML` 语法版本。具体的语法不阐述了,基本是所谓的 **所见即所得** 理解模式。
-
-## Comparison
-
-既然是官方提倡的 Migration,Github 提供了很多帮助文档,来帮助你进行迁移。
-
-最后的 [References](#references) 部分,会给出一些官方提及的参考链接,但在这里更想直观地提及的是这次 Migration 前后的几个重要对比。
-
-- 在 Github Repo 的设置界面,原本的图形化修改 workflow 文件已经失效,直接变成一个 YAML 文件的在线编辑模式
-- workflow 的默认路径,从 `.github/*.workflow` 变成了 `.github/workflows/*.yml`
-- 对应每一个 Step 的构建日志,都添加了基础的高亮功能,再也不是默认的 `stdout | tee` 的形式了
-- 给人的感觉是启动速度超快,以前触发一次 Actions 从对应的构建环境容器拉取、启动,都花了不少时间,现在基本对标其他 CI 平台的链路完成速度
-- 元数据支持方面,多了很多可选的选项,可以在 YAML 配置里编写很多类似注释级别的 `metadata`
-- 通过模板语法支持 `matrix build`
-
-![Build Log Highlight](./build-log-highlight-support.png)
-
-## Migrations
-
-下面是我个人对 `jest-properties-loader` 在使用 TypeScript 重写后的迁移过程。
-
-### Step 0: Migration Solution
-
-我直接放弃了官方的一些迁移手段: 比如先把官方一个迁移工具 clone 到本地,紧接着执行里面的脚本,将对应的 `.workflow` 文件转化成新的 Yaml 文件。
-
-而是根据迁移后的实际流程,找官方提供的 Example 进行魔改。
-
-目前来说,`jest-properties-loader` 作为一款 NPM Package,触发 CI/CD 的流程大致如下:
-
-1. 平时提交代码,通过 CI 平台执行 `yarn test`, `yarn build`, 上传覆盖率报告到 **Codecov**
-2. 
触发 `release` 事件时,通过 CI 平台执行 `yarn build`,之后发布到 `npm registry` 上 - -### Step 1: Copy Official Node-CI Example - -在确立了流程后,我们可以从任意 Repo 的 `Actions` 标签页,选择一些对应语言/平台的 `example workflow` 进行魔改。 - -![Select NPM Example Workflow](./select-sample-workflow-for-npm.png) - -把官方的 `Node-CI` 直接应用到项目本身,也完全 OK。自此,我们第一步,通过 CI 平台执行`yarn test`,`yarn build`的功能就完成了。 - -```yaml -name: Node CI - -on: [push] - -jobs: - build: - runs-on: ubuntu-latest - - strategy: - matrix: - node-version: [8.x, 10.x, 12.x] - - steps: - - uses: actions/checkout@v1 - - name: Use Node.js ${{ matrix.node-version }} - uses: actions/setup-node@v1 - with: - node-version: ${{ matrix.node-version }} - - name: npm install, build, and test - run: | - npm install - npm run build --if-present - npm test -``` - -### Step 2: Upload coverage report to Codecov - -上传通用的 `lcov.info` 覆盖率报告文件到 CodeCov, 这一步也从之前的野生第三方 actions 切换到官方的 actions。 - -如下图,我们使用 新的 `${{secrets.CODECOV_TOKEN}}` 来获取 repo secrets 中的对应变量 - -```yaml -- name: Upload coverage to Codecov - uses: codecov/codecov-action@v1.0.0 - with: - token: ${{secrets.CODECOV_TOKEN}} - file: ./reports/coverage/lcov.info - flags: unittests - name: codecov-umbrella -``` - -### Step 3: Trigger Build Step with Condition (Release) - -在进行发布时,按照上篇博文的 **创建 Release 触发新的 Workflow** 的思路,原本的`HCL`配置应该这么写: - -**Before:** - -```hcl -workflow "release" { - on = "release" - resolves = ["npm:release"] -} - -action "filter:release" { - uses = "actions/bin/filter@master" - args = "action created*" -} - - -action "npm:release" { - needs = "filter:release" - uses = "actions/npm@master" - secrets = ["NPM_AUTH_TOKEN"] - args = "publish" -} -``` - -**After:** - -```yaml -name: Publish to NPM - -on: - release: - branches: - - master - -jobs: - build: - runs-on: ubuntu-latest - steps: - - uses: actions/checkout@v1 - - uses: actions/setup-node@v1 - with: - node-version: 10 - - run: | - yarn install - yarn test - yarn build - - publish-npm: - needs: build - runs-on: ubuntu-latest - steps: - - uses: 
actions/checkout@v1 - - uses: actions/setup-node@v1 - with: - node-version: 10 - registry-url: https://registry.npmjs.org/ - - run: | - yarn install - yarn test - yarn build - - run: npm publish - if: github.event.action == 'created' - env: - NODE_AUTH_TOKEN: ${{secrets.NPM_AUTH_TOKEN}} -``` - -还是原汁原味的读取 `NPM_AUTH_TOKEN` secrets 变量, 使用 `if` 条件表达式代替原来的 `bin/filter` + `args` 条件表达式。 - -即可完整地代替原本的 HCL 语法 Workflow 。 - -## References - -- [Migrating GitHub Actions from HCL syntax to YAML syntax](https://help.github.com/en/articles/migrating-github-actions-from-hcl-syntax-to-yaml-syntax) -- [Automating your workflow with GitHub Actions](https://help.github.com/en/categories/automating-your-workflow-with-github-actions) diff --git a/data/posts/2019/11/14/monorepo-practice-in-typescript-projects.md b/data/posts/2019/11/14/monorepo-practice-in-typescript-projects.md deleted file mode 100644 index 0b6f70528..000000000 --- a/data/posts/2019/11/14/monorepo-practice-in-typescript-projects.md +++ /dev/null @@ -1,415 +0,0 @@ ---- -title: 'Monorepo Practice in TypeScript Projects' -id: monorepo-practice-in-typescript-projects -created: 2019-11-14 -updated: 2019-11-14 -categories: - - Blog -tags: - - NPM - - CI - - Typescript - - Monorepo - - Lerna - - Jest -cover: ./cover.png ---- - -# Monorepo Practice in TypeScript Projects - -## Background - -随着 Node 生态社区的发展,越来越多的 Nodejs 代码仓库开始采用 Monorepo 的形式进行管理。 - -我们可以看到一些前端 UI 框架、Web 框架,在新版本/一开始就采用了 monorepo 的形式管理代码。 - -> Q: 什么是 Monorepo? Monorepo 的基本好处是什么? 
A: 这里不多阐述,可以参考知乎上的这篇文章: [https://zhuanlan.zhihu.com/p/31289463](https://zhuanlan.zhihu.com/p/31289463)
-
-在其他(这里指一些较成熟的语言,如 Java)语言的包管理生态圈里已有很多成熟的解决方案,如 Java 世界里的 Maven/Gradle , 其中有很多最佳实践在各种成熟的开源项目中有所体现。
-
-由于 Node 官方的 NPM Package Registry 并没有提供一个官方的解决方案,所以目前会有很多的实现方案,比如 `Lerna` , `Yarn Workspace` , 更甚者还有一些脚本替换发布方案。
-
-近期我在好几个 Node with TypeScript 项目(不管是工作中的项目抑或是自己的项目)中,都尝试开始使用 MonoRepo 的形式去组织代码结构与 CICD 流程。这里分享下对 Nodejs + TypeScript 的一些实践经验。
-
-### Problems
-
-要使用 Multi-Repo 还是 Monorepo,又或者是如何从 Multi-Repo 迁移、进化成 Monorepo,不同项目都面对着不同的问题。
-
-我抽取了这些 Node 项目在修改成 Monorepo 的过程中面对的一些问题,并在下文中给出目前我觉得较为合理的一些解决方案。
-
-### Unit Testing
-
-常规的 Multi-Repo 的单元测试流程理解起来会比较简单。以常见的 `jest` 来看,为单个 Repo 设立单元测试、为单个 Repo 关联代码覆盖率相关的 CI 配置/平台,都相当简单。
-
-当执行 Jest 进行单元测试的时候,我们只用维护一份 Jest 配置。接着一切便按照 Jest 的配置正常运作:
-
-- 代码覆盖率相关的 CI 读取 Jest 配置中指定的 `coverageDirectory` 下相关的覆盖率报告文件
-- 需要在 Jest 执行的各种生命周期的 Hooks 都能通过配置一目了然
-
-当我们把这些 Multi-Repo 集中到一起的时候,问题便显现出来:
-
-- 运行测试是逐个 sub-packages 单独执行测试吗? 如 `lerna run test`
-- 单元测试的配置应该如何维护? 维护在 Monorepo 的根目录还是单独拆分维护在每个 sub-package 下?
-- 如何像 Multi-Repo 那样与测试覆盖率的 CI 结合起来?
-
-### With Path Alias
-
-现代化的 Node 项目(不管是 Node 还是前端项目),很多时候都有一些 Path-Alias 的技巧。
-
-具体表现是什么呢? 比如利用 `webpack` 的 `path.resolve` 配置或者 `tsconfig` 中 `paths` 的配置,做一些路径的简化,以节省各种相对路径 `../..` 的使用。
-
-如 `@/main.ts` 实际指向了以项目根目录开始的 `/src/main.ts`
-
-但是,在 Monorepo 下,我们再想方便地以 `@` 作为 Path Alias,则会遇到很多思考上的问题。
-
-在单个 sub-package 内理解并执行 Path-Alias 会跟原先的方式没什么区别。但是要让 root package 也理解 sub-package 的 Path-Alias 配置,则需要很多思考:
-
-- 多个使用相同 Path-Alias 的 sub-packages 之间应该如何解析? 
大家都在使用相同的 `@` 作为引用路径别名的情况下,怎么知道此时的 `@` 是对应哪一个正确的绝对路径呢 -- 在 Monorepo 的情况下,还该不该使用 Path-Alias - -相信这些问题,都将是大家在实践中会遇到的一些问题 - -## Example Project - -下面以我的博客 repo v6.0 使用 TypeScript 和 monorepo 形式开发过程中的一些实践。 - -将会以比较简短的篇幅描述项目 Monorepo 相关的目录结构与相关的配置、配置要点,并描述如何/使用哪种方案去解决以上提及的问题。 - -### Example Project Introduction - -整个项目源代码形式都以 TypeScript 作为主要语言,项目使用 Monorepo 进行一些子库之间的依赖管理。 - -其中使用 Lerna 进行 Monorepo 管理的主要工具,使用 Jest 作为贯穿所有子项目的单元测试框架。 - -整类子项目大概可以分成以下几种类型: - -``` -packages -├── api-generator // 工具库 - 根据路由元数据生成 JSON 结构的 API 相应内容 -├── application // 核心构建流程,将所有工具库和主题通过一定的构建流程串联在一起 -├── article-tools // 工具库 - 将 .md 文件结合Markdown工具库转换成文章数据 -├── common // 公共接口声明 -├── config // 工具库 - 博客项目的配置发现与读取类库 -├── markdown // 工具库 - 基于 Markdown-it 编写的各种扩展与插件 -├── migration // 临时脚本 - 将上一版本的博客应用数据结构迁移的脚本集 -├── pwa-tools // 工具库 - 基于 Workbox 生成 PWA 相关资源 -├── routes-tools // 工具库 - 生成静态路由极其 Meta 信息 -├── theme-react // 基于 React + Material UI 的前端主题 -└── theme-vue // 基于 Vue + vuematerial 的前端主题 -``` - -按照互相之间的依赖路径,会是以下这样一种依赖形式: (可以通过 `lerna ls --graph --all` 命令查看 ) - -``` -{ - "@blog/api-generator": [ - "@blog/article-tools", - "@blog/common", - "@blog/routes-tools" - ], - "@blog/application": [ - "@blog/api-generator", - "@blog/article-tools", - "@blog/common", - "@blog/config" - ], - "@blog/article-tools": [ - "@blog/common", - "@blog/markdown" - ], - "@blog/common": [ - ], - "@blog/config": [ - ], - "@blog/markdown": [ - "@blog/common" - ], - "@blog/migration": [ - "@blog/article-tools", - "@blog/markdown" - ], - "@blog/pwa-tools": [ - "@blog/common" - ], - "@blog/routes-tools": [ - "@blog/article-tools", - "@blog/common", - "@blog/config" - ], - "@blog/theme-react": [ - "@blog/common", - "@blog/config" - ], - "@blog/theme-vue": [ - "@blog/common", - "@blog/config", - "@blog/routes-tools" - ] -} -``` - -### Example Project Folder Structure - -现在我们来看整个项目的结构,以及其特点 (为节省面板,省略部分结构相同的工具类库目录) - -``` -├── CHANGELOG.md -├── LICENSE -├── README.md -├── data -├── lerna.json -├── package.json -├── packages -│   
├── api-generator -│   ├── application -│   │   ├── README.md -│   │   ├── dist -│   │   ├── nest-cli.json -│   │   ├── package.json -│   │   ├── src -│   │   ├── tsconfig.build.json -│   │   ├── tsconfig.json -│   ├── article-tools -│   ├── common -│   │   ├── README.md -│   │   ├── constants -│   │   ├── interfaces -│   │   ├── package.json -│   │   ├── tsconfig.json -│   │   ├── utils -│   │   └── yarn.lock -│   ├── config -│   │   ├── README.md -│   │   ├── dist -│   │   ├── package.json -│   │   ├── src -│   │   ├── tsconfig.json -│   │   └── yarn.lock -│   ├── markdown -│   │   ├── README.md -│   │   ├── dist -│   │   ├── package.json -│   │   ├── src -│   │   ├── tsconfig.json -│   │   └── yarn.lock -│   ├── migration -│   ├── pwa-tools -│   ├── routes-tools -│   ├── theme-react -│   │   ├── dist -│   │   ├── gulpfile.ts -│   │   ├── package.json -│   │   ├── src -│   │   ├── tsconfig.json -│   │   ├── tsconfig.webpack.json -│   │   ├── webpack -│   │   ├── yarn-error.log -│   │   └── yarn.lock -│   └── theme-vue -│   ├── README.md -│   ├── dist -│   ├── package.json -│   ├── public -│   ├── src -│   ├── tsconfig.json -│   ├── vue.config.js -│   ├── vue.config.ts -│   ├── webpack.config.js -│   └── yarn.lock -``` - -可以看到,中间标记为工具库一类的类库,都有 `dist` 和 `src` 目录,作为以 TypeScript 为源代码的语言,我给他们人为地设定了基本的 `TypeScript` + `Lib` 的构建路径,所以在单个项目接口来看,他们都具有通用的 `tsconfig.json` 和 jest 使用的 `jest.config` (in package.json). 
-
-**tsconfig.json**
-
-```json
-{
-  "compilerOptions": {
-    "module": "commonjs",
-    "declaration": true,
-    "removeComments": true,
-    "emitDecoratorMetadata": true,
-    "experimentalDecorators": true,
-    "resolveJsonModule": true,
-    "target": "es6",
-    "sourceMap": true,
-    "outDir": "./dist",
-    "baseUrl": "./",
-    "incremental": true,
-    "noImplicitAny": false,
-    "typeRoots": ["./node_modules/@types"]
-  },
-  "exclude": ["node_modules", "dist", "reports", "**/*.test.ts"]
-}
-```
-
-**jest.config**
-
-```json
-{
-  "jest": {
-    "moduleFileExtensions": ["ts", "js", "json"],
-    "transform": {
-      "^.+\\.ts$": "ts-jest",
-      "^.*\\.md$": "jest-raw-loader"
-    },
-    "collectCoverageFrom": ["!**/__tests__/**", "<rootDir>/src/**/*.ts"],
-    "testMatch": ["<rootDir>/src/**/*.test.ts"],
-    "testEnvironment": "node",
-    "coverageDirectory": "<rootDir>/coverage"
-  }
-}
-```
-
-对于单一一个类库,必定有着两个 npm script: build, test:
-
-- npm run build 将执行命令 tsc,根据 tsconfig.json 的配置,将 `src` 目录下的 ts 文件转译成 `dist` 目录下的 commonjs 文件。
-- npm run test 将执行命令 jest,根据 jest 的配置, 读取 `<rootDir>/src/**/*.test.ts` unix path glob pattern 匹配的测试文件执行单元测试。
-
-## Solutions
-
-言归正传,我们回到之前引出的 Monorepo 在实践中诞生的第一个问题
-
-- UnitTesting cross Monorepo
-
-不过我们可以由浅入深,先来看看这些由纯工具类库构造成的 Monorepo 如何执行单元测试,再进化到遇到其他乱七八糟的复杂情况时,应当如何应对。
-
-### UnitTesting cross Monorepo
-
-由于我们可以直接在每个工具类库下单独执行 `npm run test` 来分别执行单元测试,那么则可以在项目根目录下通过 `lerna run test` 执行。
-
-啊哈!事情往往没有这么简单。我们执行单元测试的过程,不仅仅是为了分批执行多个子项目的单元测试,判断中间是否有非 0 退出的错误案例,用以快速反馈本次 commit 是否可能造成了破坏性的后果。还能结合很多测试结果相关的 CI 工具,反馈单元测试覆盖率等关键指标。
-
-在开源界免费、支持度比较好的覆盖率分析平台,不得不说 [Codecov](https://codecov.io/) 。基本上在尝试过所有类似的、免费的覆盖率报告分析平台后,最后都选择迁移到 Codecov 上。
-
-那这里引发的关键思考点是: 我们分别在每个子项目下执行 `npm run test` ,根据各个子项目的 jestconfig, 将会在不同的子项目目录下生成覆盖率报告 (这里特指 jest 生成的 linux 标准的 `lcov.info` 文件)。那么我们的 codecov 有办法汇总多个不同的 lcov.info 文件并为 Monorepo 的单个大项目仓库生成一份覆盖率报告吗 ?
-
-我至今还没为这种思路找到经典、合理的解决方案。
-
-如果在 Monorepo 的根目录里执行 `jest` ,接着为整个项目生成一份覆盖率报告,这样来说是不是比较方便的解决这个问题呢? 
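-
-沿着这个思路,根目录 package.json 中与测试相关的 scripts 大致如下(示意片段;`test` 与 `test:cov` 的命名与本仓库根目录实际使用的脚本一致):
-
-```json
-{
-  "scripts": {
-    "test": "jest",
-    "test:cov": "jest --coverage"
-  }
-}
-```
-
-这样在根目录执行 `yarn test:cov` 时,就会按根目录的 jest 配置一次性跑完所有 sub-packages 的测试,并生成单份覆盖率报告。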
- -由于我们的工具类子仓库大部分都含有相同的目录结构,比如单元测试文件都可以用固定的 glob 表达式进行匹配,因而我们可以在项目根目录里安装 `typescript` 和 `jest` 相关依赖,然后自上而下的查找 sub-packages 中的测试文件执行测试,使得我们可以通过在不同的路径执行 `jest` 而达到不同的效果: - -- 在根目录下执行 jest,可以一次执行所有 sub-packages 的测试,并在根目录指定的文件夹下生成覆盖率报告 -- 在单个 sub-package 下执行 jest,可以快速的执行单个 sub-packages 下的测试,执行效率高,且能适配大多数 IDE、Editor 的单元测试执行上下文自动发现 - -所以这里的解决方案的前置要求便是: - -- 每个 sub-package 具有类似的、统一的测试路径结构风格 -- 每个 sub-package 使用到的 jest extension 不允许存在互斥的情况 -- 根目录的 jest 配置是所有 sub-packages 的并集 - -到了这里,我们可以看一下在这种解决方案下的更目录的 jestconfig: - -其中关键的节点是 `testMatch` 与 `collectCoverageFrom` 下的字段,用 `/packages/**/src/**/*` 去匹配所有 sub-packages 下的源代码与测试代码。 - -```json -{ - "jest": { - "moduleFileExtensions": ["ts", "js", "json"], - "transform": { - "^.+\\.ts$": "ts-jest", - "^.*\\.md$": "jest-raw-loader" - }, - "collectCoverageFrom": [ - "!**/__tests__/**", - "!**/src/index.ts", - "!**/src/main.ts", - "!**/src/plugins.ts", - "!**/*.module.ts", - "!**/migration.ts", - "/src/**/*.ts", - "/packages/**/src/**/*.ts" - ], - "testMatch": ["/src/**/*.test.ts", "/packages/**/src/**/*.test.ts"], - "testEnvironment": "node", - "coverageDirectory": "/coverage" - } -} -``` - -如此一来,我们便解决了基本的问题。 - -下面我们来看糅合了实际工作中一些复杂的情况:整合带 Path Alias 的子项目。 - -### Path Alias - -这里将会以博客项目中的 `application` 子项目作为讲解。 - -`application` 子项目是一个使用了 nestjs 作为基本框架的项目,他在整个博客应用中起的作用是: - -调用其他类库,并按照一定的顺序执行整个构建工作流,如扫描读取所有 Markdown 文件并提取其元数据,根据元信息构造路由,根据路由信息构造 API 内容 等等。 - -所以作为整个 Monorepo 中具有唯一性的、没有被其他 sub-package 依赖的调用方项目,我为他设置了一个 Path Alias: `@-> ./src` - -在子项目路径  `packages/application` 目录下,tsconfig.json 中的路径别称是: - -```json -{ - "compilerOptions": { - "paths": { - "@/*": ["src/*"] - } - } -} -``` - -在 jestconfig 中也需要声明这个 Path Alias: - -```json -{ - "jest": { - "moduleNameMapper": { - "^@/(.*)$": "/src/$1" - } - } -} -``` - -为了遵循原先的解决方案需求,为应对这种具有 Path Alias 的子项目,我的实践如下: - -- 子项目本身设立 Path Alias -- 子项目的测试代码结构与其他类库路径风格对齐 -- 根目录在 jestconfig 下添加相关的 Path Alias 设置,即: - -```json -{ - "jest": { - "moduleNameMapper": { - "^@/(.*)$": "/packages/application/src/$1" 
- } - } -} -``` - -这样便能解决 Path Alias x UnitTesting 的问题。 - -### Summarization - -结合在项目中遇到的以上问题以及目前的实践方案,也反向引导出我们在项目中也需要遵循一些认为的、容易接受的约定。 - -总结这几个问题的解决思路,可以汇总成一下个情景。 - -我们在遇到 TypeScript + Jest + Monorepo 的场景时,需要遵循以下几种约定: - -- 带有 Path Alias 的子项目,尽量不要被其他子项目所依赖 -- 所有子项目的测试文件路径表达式尽量一致 - -在执行测试和构建方面,则需要注意以下的配置方式: - -- 在根目录下执行 Jest,Jest 的配置是所有子项目 Jest 配置的并集 -- 在根目录下执行 Jest,Jest 的配置需要理解带 Path Alias 子项目的路径 Mapping (即需要根目录理解此子项目的 Mapping 配置) - -### More Complex Situation - -中间还会发现有一些更复杂的情况 - -- 多个子项目的 tsconfig.json 不尽相同,无法合并出合理的并集? -- 子项目是一个前端项目,用到了很多不同的 loader 与插件,如 React/Vue ,根目录的配置是否也需要理解并安装这些 loader 呢? - -## References - -- Lerna at Github [https://github.com/lerna/lerna](https://github.com/lerna/lerna) -- Javascript Monorepo with Lerna [https://medium.com/@erzhtor/javascript-monorepo-with-lerna-5729d6242302](https://medium.com/@erzhtor/javascript-monorepo-with-lerna-5729d6242302) diff --git a/data/posts/2020/01/28/review-2019.md b/data/posts/2020/01/28/review-2019.md deleted file mode 100644 index 2887c6446..000000000 --- a/data/posts/2020/01/28/review-2019.md +++ /dev/null @@ -1,80 +0,0 @@ ---- -title: 'Year in Review 2019' -id: review-2019 -created: 2020-01-28 -updated: 2020-01-28 -categories: - - Blog -tags: - - Diary -cover: ./cover.png ---- - -# Year in Review 2019 - -新年伊始,适逢 2019-nCov 疫情,多了几天在家办公的时间。 - -正好回顾下忙碌的 2019 年。 - -## Movement - -19 年 4 月,得知房租将上涨后,找了一家新的房子,从步行上班距离的城中村搬到了骑车上班距离的远一点的空公寓。 - -家具又是自己买了一番(虽然大出血但是内心自我感觉住宿品质回到了在珠海的时候)。 - -![New Apartment](./movement.png) - -这次主要都从宜家购置的新家具,电脑桌/床/床头柜 都是山榉木材料 + 原木色,风格比较一致。 - -有空可以整理下 2019 年购置的装备列表。 - -现在下班后回到家终于有一丝轻松躺在椅子上的感觉了。 - -## Reading - -说是阅读其实也不算是,因为今年读的有一部分属于一些入门工具书,并不是每本都可以反复阅读,何况还有一些还没读完 - -好歹是动物书系列,也都简介下 - -- Kubernetes 即学即用 [https://item.jd.com/12476325.html](https://item.jd.com/12476325.html) 100% -- Kubernetes 经典实例 [https://item.jd.com/12446081.html](https://item.jd.com/12446081.html) 80% 上面两本都属于比较薄的 kubernetes 入门工具书了,不同的译者水平讲述了一些 k8s 官方原生的概念,阅读顺序稍微比官方文档要容易理解一些。推荐指数: ★★★ -- 数据密集型应用系统设计 
[https://item.jd.com/12437624.html](https://item.jd.com/12437624.html) 25% 大名鼎鼎的 DDIA,在中国电力出版社的翻译版本出来之前,在 Github 上粗略瞟过一些用户自己翻译的版本,很多都不能完全做到 "达" 字。 - -这本书译者水平相对较高,所以针对翻译中提到的种种案例,都能够很好的表达出来。在工作了若干年后接触到这本书,开头阶段会得到许多共鸣,中间则可以在很多经典案例与底层数据结构解析的过程中,得到很多工程化的经验、技术选型的依据。 - -还没细读完,但已经补到了很多知识,强烈推荐 - -推荐指数: ★★★★★ - -- 上瘾 (Hooked) [https://item.jd.com/12203529301.html](https://item.jd.com/12203529301.html) 100% - -非技术书籍。阅读过程比较轻松,可能算是产品向的一些业余书籍。不提供推荐指数 - -## Apps - -2019 年新增订阅软件: - -### Notion - -一款功能比较全的笔记、看板软件。 - -个人看重以下一些优点 - -- 界面好看 -- 全平台支持 (Web/iOS/Android) -- 近似 Markdown 的编辑体验 - -一些比较纠结的缺点 - -- 导出成 Markdown 格式时,代码片段格式不尽人意 -- 一些中文字符支持不够好 - -## Work - -工作任务一直很繁重。 - -在五六个项目直接切换不同开发角色,经常有计划以外的线上问题要处理。基本上已经违反了很多 Engineering 的基本原则。 - -这纵使令人难受,亦是挑战。 - -希望假期能够汇总过去的一些案例,调整过来。 diff --git a/data/posts/2020/03/01/keep-your-repo-dependencies-up-to-date-with-renovate.md b/data/posts/2020/03/01/keep-your-repo-dependencies-up-to-date-with-renovate.md deleted file mode 100644 index 99bd417ad..000000000 --- a/data/posts/2020/03/01/keep-your-repo-dependencies-up-to-date-with-renovate.md +++ /dev/null @@ -1,244 +0,0 @@ ---- -title: '使用 renovate 监控第三方依赖更新' -id: keep-your-repo-dependencies-up-to-date-with-renovate -created: 2020-03-01 -updated: 2020-03-18 -categories: - - Blog -tags: - - Node - - CI - - NPM - - Renovate - - Github -cover: ./revonate.png ---- - -# 使用 renovate 监控第三方依赖更新 - -## 背景 - -依赖更新管理曾一直是我容易纠结的一个问题。 - -我希望一直保持我的仓库使用的第三方 package 一直能保持 **安全情况下** 的最新版本, - -随着最新版本的更新,一般伴随着以下几种用户关心的内容: - -- fixing: 修复了一些 bug -- performance tuning: 一些性能优化 -- feat: 一些新的功能 -- BREAKING CHANGES: 这个版本相对老版本是否有破坏性的更变,可以帮助开发者决策是否直接升级(亦或是不升级,或先兼容后升级) - -不管是使用什么语言的项目,对应生态中,成为事实标准的包管理器,一般都会提供如下的依赖更新相关功能: - -- 查找项目中已安装的依赖版本,对比包管理器 registry 上的最新可升级版本,并提示最新版本可用 -- 在按照原依赖描述文件安装依赖时,可提示开发者有版本更新 - -对于接触的比较多的 node 项目,他的事实标准(在我心中的)可能是 `yarn` 。 - -因为 npm 及其 package 的版本号设计,遵循了 [semver versioning](https://semver.org/) 的一些基本原则,使得更新依赖稍微能程序化一点。 - -在简单的情况下,对于常见的 node + singlerepo 类型项目,我一般怎么做? 
-
-我使用的是 yarn 的 `upgrade-interactive --latest`
-
-官方文档: [https://classic.yarnpkg.com/en/docs/cli/upgrade-interactive/](https://classic.yarnpkg.com/en/docs/cli/upgrade-interactive/)
-
-通过这种手动在本地执行命令的形式,可以获取项目所依赖的 npm package 的最新版本与本地依赖描述文件之间的差异,以及提供一个交互式的命令供你选择你想要更新的部分 packages。
-
-之后在下载安装最新版本的依赖时,同时把那些依赖的最新版本的版本号持久化到依赖描述文件 `package.json` 中去。
-
-毫不犹豫的说,对于开源(托管在 github 上的 public repo),亦或是依赖了一些私有 registry 的 package 的代码仓库,这种人工更新的方式还是比较有效的。
-
-## 现状
-
-这种情况会有什么缺陷呢?
-
-把目前的人工更新方案,拆分成两种类型的缺陷:
-
-1. 人工执行命令以主动检查更新 (主动拉取)
-2. 对于 lerna + monorepo 而言,yarn 的 `upgrade-interactive` 对 monorepo 没有很好的支持
-
-人工执行命令以主动检查更新,对于依赖个数上百、数百的项目来讲,着实有点麻烦。相当于一个人工轮询去检查是否有更新,而且放到开发者这方的话,开发体验不会很好。
-
-稍微好一点的方案,是自动化地执行这些更新脚本,并开发一个类似 `-y` 一样的参数来实现静默允许直接更新依赖,以跳过交互式命令的步骤。
-
-对于 lerna + monorepo 呢?
-
-yarn 的 `upgrade-interactive` 没能很好的支持,技术原因上是因为 lerna 在执行命令 `bootstrap` 的过程中,对 monorepo 内具有互相依赖关系的私有子项目,并不是直接在每个子仓库下进行 `npm install` or `yarn install` 。而是会拆分成几个步骤,这里为了简述,描述成几步:
-
-1. 对 monorepo 下的多个 package.json 进行 backup,复制一份备份版本
-2. 对 monorepo 下的多个 package.json 进行修改,移除 monorepo 的私有依赖声明
-3. 在每个仓库下分别执行 install 命令,此时能确保都能从远程的 registry 上下载其他非私有依赖
-4. 进行 symlink,将私有依赖创建 symlink,根据计算好的依赖关系,连接到每个子项目的 `node_modules` 对应的目录中去
-5. 将备份版本的多个 package.json 归回原位
-
-由于 `yarn upgrade-interactive` 并不能识别 package.json 中的私有依赖,在 monorepo 中执行此命令,会由于无法在远程 registry 中查找到私有依赖而错误退出。
-
-yarn 在最近发布了 v2 大版本,不知道对他原生的 monorepo (workspace) 支持,是否会考虑到这里的增强。
-
-这是我认为人工执行 `yarn upgrade-interactive` 所存在的问题。
-
-那么常规的开源项目,或者我曾了解的开源项目怎么做?
-
-对,那就是代码仓库平台集成的一些依赖监控服务,在没有讲到今天的主题之前,我所了解到的是:
-
-- [https://dependabot.com/](https://dependabot.com/) dependabot
-- [https://david-dm.org/](https://david-dm.org/) david-dm 专门为 node.js 类 repo 提供依赖版本监控
-
-有一说一 david-dm 确实比较菜了,作为第三方服务基本处于不可用的状态。
-
-dependabot 可以提供监控多语言的依赖更新、协助更新依赖且提 Pull Request 等自动化服务。
-
-然而在轻度体验后,发现有几点不太理想:
-
-1. 不能自动识别仓库的主要编程语言,需要在其官方平台网站上进行手动确认
-2. 
对于 monorepo 的问题,直至 2020 年 2 月底也没有好的解决方案 - -## 遇见 Renovate - -最近一段时间比较关注 NestJS 社区及其成长,在学习源码时了解到一些相关的 CI 方面的实践。 - -在此过程中,了解到他们为了解决监控依赖更新问题,使用的服务是 Renovate。 - -### Renovate 是什么 - -Renovate 是一家名为 WhiteSource 的公司开发的一项适用于多种语言的依赖更新监控服务。目前在 Github 上以 Github Apps 的形式,可以为接入此 Apps 的项目提供依赖更新监控相关的服务。 - -## Renovate 使用体验 - -简单来讲,目前我在 Github 上使用 Renovate 的时间相对少,但是满足了我目前对依赖更新管控的基本需求,还有一些意外惊喜。 - -简要分享下,在安装、使用 renovate 中的一些过程。 - -### 1. 安装 renovate - -与常见的 Github Apps 一样,首先在 Github Market place 搜索 renovate 并进行安装,按需授予权限。 - -[https://github.com/marketplace/renovate](https://github.com/marketplace/renovate) - -### 2. 配置 renovate - -安装 renovate 之后无须手动操作,等待即可。 - -此时 renovate 将会扫描你授予权限的仓库,做一些简要分析,分析你的各个项目主力语言、依赖管理方式,之后将会对哪些可以被通过 renovate 管控更新依赖的仓库,分别提交一份 Pull Request. - -这个 Pull Request 的标题叫做 **Configure Renovate,**中间附带了一个文件更变,便是他会在项目的更目录下添加 `renovate` 的配置文件,名为 `renovate.json` 。 - -这个配置文件描述了一些 renovate 管理此仓库依赖的相关选项,默认生成的 `config:base` 已经能够满足日常需要。 - -如果开发者很不爽在项目根目录增添这样一个 json 配置文件,可以按照他们官方配置发现的目录查找顺序,移入到 `.github` 目录下。(具体查找规则,官方有较为详细的说明文档) - -![revonate-setup-pull-request](./revonate-setup-pull-request.png) - -### 3. Pin Dependencies - -合入第二点的 PR 后,不久变会收到第二个初始化类型的 PR: Pin Dependencies。 - -顾名思义,这里的意思是锁定目前的依赖版本,且为以后持续接受依赖更新做准备。 - -以 Node 类型仓库来讲,这个 PR 的具体内容,便是先在每个 package.json 内容中,将依赖的版本无损更新到最新版本号 (指符合 semver versioning 的更新规则) 之后,去除 package.json 中每项依赖版本号之前的 `^` 和 `~` ,以将模糊的 semver versioning 版本监控行为变成确定的版本号。 - -### 4. 
Update Dependencies
-
-在第三点之后,接下来的就是日常的监控依赖更新了。
-
-每当依赖的新版本发布时,他会针对单条依赖的更新提交 PR,如果依赖中有符合标准的 CHANGELOG 也会直接加入到 PR 的 description 中。
-
-其中,他们也会对发布到 registry 的依赖文件内容,进行 diff,以生成 renovate 自己分析的"依赖构建产物" diff,以供查看。
-
-在对项目的依赖描述文件扫描、分析更新这部分,能够对 monorepo 提供很好的支持。
-
-比方说在我的 blog 项目中,有一个属于主题的子项目,该子项目依赖了一些 blog 项目的私有依赖,大概如下:
-
-```json
-{
-  "dependencies": {
-    "@blog/common": "^6.21.4",
-    "@blog/config": "^6.21.4",
-    "@loadable/component": "5.12.0",
-    "@material-ui/core": "4.9.4",
-    "@material-ui/icons": "4.9.1",
-    "axios": "0.19.2",
-    "classnames": "2.2.6",
-    "clsx": "1.1.0",
-    "date-fns": "2.10.0",
-    "github-markdown-css": "4.0.0",
-    "highlight.js": "9.18.1",
-    "react": "16.13.0",
-    "react-disqus-components": "1.2.2",
-    "react-dom": "16.13.0",
-    "react-helmet": "5.2.1",
-    "react-router-dom": "5.1.2",
-    "scroll-into-view-if-needed": "2.2.24",
-    "typeface-roboto": "0.0.75",
-    "vanilla-lazyload": "12.5.1"
-  }
-}
-```
-
-其中 package.json 中 `@blog` 开头的两个依赖,属于私有依赖,在 npm registry 中无法找到。对于非私有依赖, renovate 都能逐个帮你进行监控,并在版本更新的时候及时提出 PR。
-
-### 意外惊喜
-
-惊喜的是,在 merge PR 时,renovate 还会增加一个 `conventional commit` 的检测: 如果你在项目中显式地配置了主流的 commit lint 以及 commit message 风格检测,他会按照这些常见的风格来修改 PR 的标题:
-
-如: [https://github.com/aquariuslt/blog/pull/38](https://github.com/aquariuslt/blog/pull/38)
-
-标题为: chore(deps): update dependency @types/node to v13.7.7 #38
-
-对于 circleci,也提供了 CI 环境下的 docker-image 版本监控
-
-![circle-ci-docker-image-support](./revonate-circleci-docker-image-support.png)
-
-> updated at 2020-03-18
-
-在使用了一段时间 renovate 之后,发现 renovate 已经提供了很多 automerge 的判断条件,以减少人工合并这种机械化请求的次数。
-
-我个人来讲,目前使用如下配置,来做到:
-
-1. 不管项目是否使用了 `semantic-release`,bot 的 PR commit 也会自带 semantic-release 风格
-2. 在 `dependencies` 非 major 更新时,所有 checking pass 之后,自动 merge
-3. 
在 `devDependencies` 有依赖版本更新时,所有 checking pass 之后,自动 merge - -```json -{ - "semanticCommits": true, - "packageRules": [ - { - "updateTypes": ["minor", "patch", "pin", "digest"], - "automerge": true - }, - { - "depTypeList": ["devDependencies"], - "automerge": true - } - ], - "extends": ["config:base"] -} -``` - -## 结论与思考 - -### 结论 - -renovate 是我目前遇到的最能满足监控依赖更新的一项服务,满足了我在 Github 上的 node 项目监控依赖更新的迫切需求。 - -目前我所有放在 github 上的 node, java 项目都接入了该项服务。 - -### 思考 - -如果要为私有项目解决这个问题,我可能会怎么做 - -在公司内部,特别是一些大厂,通常有自己的内部源码仓库,如 Self-hosted Gitlab,自研的 Git 平台,还会有内部的各种 包管理对应的私有 registry。 - -如果需要为内部私有项目解决第三方依赖版本管控问题,renovate 的使用体验则为我们带来了一个很好的标杆。 - -如果有一天有空去写轮眼这个服务,我一定会从他的官方文档、配置文档、使用体验中反推基本实现,并将中间 github 操作相关 api 转成内部平台支持的 api,编写插件,设计分析服务。 - -也许需要下面的特性: - -- 依赖描述文件扫描与发现 -- 尽可能分析出可被分析的依赖版本 -- 结合各种语言生态事实标准,更新依赖,生成合理的 commit message -- 对应仓库平台 Pull Request API 适配实现 diff --git a/data/posts/2020/03/15/gitignore.png b/data/posts/2020/03/15/gitignore.png index 7fb7880fb..1f144a062 100644 Binary files a/data/posts/2020/03/15/gitignore.png and b/data/posts/2020/03/15/gitignore.png differ diff --git a/data/posts/2020/03/28/check-motion-sense-at-settings.png b/data/posts/2020/03/28/check-motion-sense-at-settings.png index 8ad6792de..8a0989286 100644 Binary files a/data/posts/2020/03/28/check-motion-sense-at-settings.png and b/data/posts/2020/03/28/check-motion-sense-at-settings.png differ diff --git a/data/posts/2020/03/28/magisk-patch-boot-img.png b/data/posts/2020/03/28/magisk-patch-boot-img.png index 38ffc7fc6..4e2459bf6 100644 Binary files a/data/posts/2020/03/28/magisk-patch-boot-img.png and b/data/posts/2020/03/28/magisk-patch-boot-img.png differ diff --git a/data/posts/2020/03/28/magisk-patch-image-step-1.png b/data/posts/2020/03/28/magisk-patch-image-step-1.png index 1eddc2ea9..240f14733 100644 Binary files a/data/posts/2020/03/28/magisk-patch-image-step-1.png and b/data/posts/2020/03/28/magisk-patch-image-step-1.png differ diff --git a/data/posts/2020/03/28/magisk-patch-image-step-2.png 
b/data/posts/2020/03/28/magisk-patch-image-step-2.png index f10605e92..5e1103fa4 100644 Binary files a/data/posts/2020/03/28/magisk-patch-image-step-2.png and b/data/posts/2020/03/28/magisk-patch-image-step-2.png differ diff --git a/data/posts/2020/03/28/magisk-superuser.png b/data/posts/2020/03/28/magisk-superuser.png index c3fd82742..175304d20 100644 Binary files a/data/posts/2020/03/28/magisk-superuser.png and b/data/posts/2020/03/28/magisk-superuser.png differ diff --git a/data/posts/2020/03/28/magisk.png b/data/posts/2020/03/28/magisk.png index 272879934..a8536397d 100644 Binary files a/data/posts/2020/03/28/magisk.png and b/data/posts/2020/03/28/magisk.png differ diff --git a/data/posts/2020/03/28/motion-sense-skip-song.png b/data/posts/2020/03/28/motion-sense-skip-song.png index b662bc464..1cc20e972 100644 Binary files a/data/posts/2020/03/28/motion-sense-skip-song.png and b/data/posts/2020/03/28/motion-sense-skip-song.png differ diff --git a/data/posts/2020/04/04/bundlewatch-graph.png b/data/posts/2020/04/04/bundlewatch-graph.png index 59995bd84..e90c2a452 100644 Binary files a/data/posts/2020/04/04/bundlewatch-graph.png and b/data/posts/2020/04/04/bundlewatch-graph.png differ diff --git a/data/posts/2020/04/04/ci-commit-status.png b/data/posts/2020/04/04/ci-commit-status.png index 923804588..30a6e00e7 100644 Binary files a/data/posts/2020/04/04/ci-commit-status.png and b/data/posts/2020/04/04/ci-commit-status.png differ diff --git a/lerna.json b/lerna.json index b149182c2..00040ecb4 100644 --- a/lerna.json +++ b/lerna.json @@ -4,6 +4,6 @@ "service/", "themes/*" ], - "version": "6.26.198", + "version": "6.25.9", "npmClient": "yarn" } diff --git a/package.json b/package.json deleted file mode 100644 index 02846a105..000000000 --- a/package.json +++ /dev/null @@ -1,162 +0,0 @@ -{ - "name": "zexo.dev", - "version": "6.26.198", - "private": true, - "description": "source code of https://zexo.dev", - "repository": "https://github.com/aquariuslt/blog", - 
"author": "Aquariuslt ", - "license": "MIT", - "keywords": [ - "blog", - "typescript" - ], - "scripts": { - "build:service": "lerna run build:service --stream", - "build:libs": "lerna run build:lib --stream", - "build:themes": "lerna run build:theme --concurrency 1 --stream", - "build:all": "yarn build:libs && yarn build:themes && yarn build:service", - "build:prod": "yarn build:all && lerna run build:service:prod --stream", - "bootstrap": "lerna bootstrap", - "clean": "lerna run clean", - "lint": "eslint --fix", - "pretest": "yarn build:libs", - "test": "jest", - "pretest:cov": "yarn build:libs", - "test:cov": "jest --coverage", - "bundlewatch": "bundlewatch", - "release": "semantic-release" - }, - "devDependencies": { - "@commitlint/cli": "12.0.1", - "@commitlint/config-conventional": "12.0.1", - "@semantic-release/changelog": "5.0.1", - "@semantic-release/exec": "5.0.0", - "@semantic-release/git": "9.0.0", - "@semantic-release/npm": "7.0.10", - "@types/jest": "26.0.20", - "@types/node": "13.13.45", - "@typescript-eslint/eslint-plugin": "4.16.1", - "@typescript-eslint/parser": "4.16.1", - "bundlewatch": "0.3.2", - "eslint": "7.21.0", - "eslint-config-prettier": "8.1.0", - "eslint-plugin-prettier": "3.3.1", - "husky": "4.3.8", - "jest": "26.6.3", - "jest-raw-loader": "1.0.1", - "lerna": "4.0.0", - "prettier": "2.2.1", - "pretty-quick": "3.1.0", - "rimraf": "3.0.2", - "semantic-release": "17.4.1", - "ts-jest": "26.5.3", - "typescript": "4.0.5" - }, - "husky": { - "hooks": { - "pre-commit": "pretty-quick --staged", - "commit-msg": "commitlint -E HUSKY_GIT_PARAMS" - } - }, - "commitlint": { - "extends": [ - "@commitlint/config-conventional" - ], - "rules": { - "type-enum": [ - 2, - "always", - [ - "chore", - "feat", - "fix", - "docs", - "style", - "refactor", - "test", - "ci", - "perf", - "revert", - "pages", - "posts" - ] - ] - } - }, - "jest": { - "globals": { - "ts-jest": { - "diagnostics": false, - "tsConfig": { - "experimentalDecorators": true - } - } - }, - 
"moduleFileExtensions": [ - "ts", - "js", - "json" - ], - "moduleNameMapper": { - "^@/(.*)$": "/service/src/$1" - }, - "transform": { - "^.+\\.ts$": "ts-jest", - "^.*\\.md$": "jest-raw-loader" - }, - "collectCoverageFrom": [ - "!**/__tests__/**", - "/service/src/**/*.ts", - "/packages/**/src/**/*.ts" - ], - "testMatch": [ - "/service/src/**/*.test.ts", - "/packages/**/src/**/*.test.ts" - ], - "testEnvironment": "node", - "coverageDirectory": "/coverage" - }, - "release": { - "plugins": [ - "@semantic-release/commit-analyzer", - "@semantic-release/release-notes-generator", - "@semantic-release/github", - "@semantic-release/npm", - "@semantic-release/changelog", - [ - "@semantic-release/exec", - { - "prepareCmd": "lerna exec --concurrency 1 -- npm version ${nextRelease.version} && lerna version ${nextRelease.version} --no-git-tag-version --no-push --amend --yes" - } - ], - [ - "@semantic-release/git", - { - "assets": [ - "docs", - "service", - "packages", - "themes", - "lerna.json", - "package.json", - "README.md", - "CHANGELOG.md" - ], - "message": "chore(release): ${nextRelease.version} [skip ci]\n\n${nextRelease.notes}" - } - ] - ] - }, - "bundlewatch": { - "files": [ - { - "path": "dist/static/js/*.js", - "maxSize": "400 kB" - }, - { - "path": "dist/static/css/*.css", - "maxSize": "30 kB" - } - ] - } -} diff --git a/packages/api/package.json b/packages/api/package.json deleted file mode 100644 index cad98f8a2..000000000 --- a/packages/api/package.json +++ /dev/null @@ -1,62 +0,0 @@ -{ - "name": "@blog/api", - "version": "6.26.198", - "description": "> TODO: description", - "author": "Aquariuslt ", - "homepage": "https://github.com/aquariuslt/blog#readme", - "license": "MIT", - "private": true, - "repository": { - "type": "git", - "url": "git+https://github.com/aquariuslt/blog.git" - }, - "main": "dist/index.js", - "types": "dist/index.d.ts", - "files": [ - "dist" - ], - "scripts": { - "clean": "rimraf dist", - "test": "jest", - "build:lib": "tsc" - }, - 
"dependencies": { - "@blog/article": "^6.26.198", - "@blog/common": "^6.26.198", - "@blog/router": "^6.26.198", - "date-fns": "2.19.0", - "mkdirp": "1.0.4", - "uslug": "1.0.4" - }, - "devDependencies": { - "@types/fancy-log": "1.3.1", - "@types/jest": "26.0.20", - "@types/lodash": "4.14.168", - "@types/mkdirp": "1.0.1", - "@types/node": "13.13.45", - "jest": "26.6.3", - "jest-raw-loader": "1.0.1", - "ts-jest": "26.5.3", - "typescript": "4.0.5" - }, - "jest": { - "moduleFileExtensions": [ - "ts", - "js", - "json" - ], - "transform": { - "^.+\\.ts$": "ts-jest", - "^.*\\.md$": "jest-raw-loader" - }, - "collectCoverageFrom": [ - "!**/__tests__/**", - "/src/**/*.ts" - ], - "testMatch": [ - "/src/**/*.test.ts" - ], - "testEnvironment": "node", - "coverageDirectory": "/coverage" - } -} diff --git a/packages/api/src/posts.api.util.ts b/packages/api/src/posts.api.util.ts deleted file mode 100644 index 347530393..000000000 --- a/packages/api/src/posts.api.util.ts +++ /dev/null @@ -1,56 +0,0 @@ -import * as _ from 'lodash'; -import * as path from 'path'; -import { ArticleContext } from '@blog/common/interfaces/articles/article-context'; -import { createArticleOverview } from './article.util'; -import { createTagDetailLinkItem } from './tags.api.util'; -import { createCategoryLinkItem } from './categories.api.util'; -import { buildPagePathFromContext, buildPostPathFromContext } from '@blog/router'; -import { isImageHosting } from '@blog/article'; - -const PNG_EXTENSION = '.png'; -const WEBP_EXTENSION = '.webp'; -/** @description simply `/` and `/posts` api response */ -export const createPostsOverviewApiData = (contexts: ArticleContext[]) => _.map(contexts, createArticleOverview); - -export const createPostDetailApiData = (id: string, contexts: ArticleContext[]) => { - const context = _.find(contexts, { id }); - - const contextPath = buildPostPathFromContext(context); - - let html = context.html; - _.each( - context.images.filter((image) => !isImageHosting(image)), - (image) 
=> { - const webpImage = _.replace(image, PNG_EXTENSION, WEBP_EXTENSION); - html = html.replace(image, path.join(contextPath, image)); - html = html.replace(webpImage, path.join(contextPath, webpImage)); - } - ); - - return Object.assign({}, context, { - tags: _.map(context.tags, createTagDetailLinkItem), - categories: _.map(context.categories, createCategoryLinkItem), - cover: path.join(contextPath, context.cover), - html: html - }); -}; - -export const createPageDetailApiData = (context: ArticleContext) => { - const contextPath = buildPagePathFromContext(context); - let html = context.html; - _.each( - context.images.filter((image) => !isImageHosting(image)), - (image) => { - const webpImage = _.replace(image, PNG_EXTENSION, WEBP_EXTENSION); - html = html.replace(image, path.join(contextPath, image)); - html = html.replace(webpImage, path.join(contextPath, webpImage)); - } - ); - - return Object.assign({}, context, { - tags: [], - categories: [], - cover: path.join(contextPath, context.cover), - html: html - }); -}; diff --git a/packages/api/tsconfig.json b/packages/api/tsconfig.json deleted file mode 100644 index f1a0e1870..000000000 --- a/packages/api/tsconfig.json +++ /dev/null @@ -1,9 +0,0 @@ -{ - "extends": "../tsconfig.lib.json", - "compilerOptions": { - "outDir": "./dist", - "baseUrl": "./" - }, - "include": ["src"], - "exclude": ["node_modules", "dist", "reports", "src/__tests__"] -} diff --git a/packages/article/package.json b/packages/article/package.json deleted file mode 100644 index bc5edffb2..000000000 --- a/packages/article/package.json +++ /dev/null @@ -1,67 +0,0 @@ -{ - "name": "@blog/article", - "version": "6.26.198", - "description": "blog markdown article tools", - "author": "Aquariuslt ", - "homepage": "https://github.com/aquariuslt/blog#readme", - "license": "MIT", - "private": true, - "repository": { - "type": "git", - "url": "git+https://github.com/aquariuslt/blog.git" - }, - "main": "dist/index.js", - "types": "dist/index.d.ts", - 
"files": [ - "dist" - ], - "scripts": { - "clean": "rimraf dist", - "test": "jest", - "build:lib": "tsc" - }, - "dependencies": { - "@blog/common": "^6.26.198", - "@blog/markdown": "^6.26.198", - "highlight.js": "10.6.0", - "lodash": "4.17.21", - "markdown-it": "12.0.3", - "markdown-it-anchor": "6.0.1", - "mkdirp": "1.0.4", - "rimraf": "3.0.2", - "sharp": "0.27.2", - "uslug": "1.0.4" - }, - "devDependencies": { - "@types/fancy-log": "1.3.1", - "@types/jest": "26.0.20", - "@types/lodash": "4.14.168", - "@types/markdown-it": "0.0.9", - "@types/node": "13.13.45", - "@types/sharp": "0.27.1", - "jest": "26.6.3", - "jest-raw-loader": "1.0.1", - "ts-jest": "26.5.3", - "typescript": "4.0.5" - }, - "jest": { - "moduleFileExtensions": [ - "ts", - "js", - "json" - ], - "transform": { - "^.+\\.ts$": "ts-jest", - "^.*\\.md$": "jest-raw-loader" - }, - "collectCoverageFrom": [ - "!**/__tests__/**", - "/src/**/*.ts" - ], - "testMatch": [ - "/src/**/*.test.ts" - ], - "testEnvironment": "node", - "coverageDirectory": "/coverage" - } -} diff --git a/packages/article/src/__tests__/__snapshots__/context.util.test.ts.snap b/packages/article/src/__tests__/__snapshots__/context.util.test.ts.snap deleted file mode 100644 index 5fad93125..000000000 --- a/packages/article/src/__tests__/__snapshots__/context.util.test.ts.snap +++ /dev/null @@ -1,144 +0,0 @@ -// Jest Snapshot v1, https://goo.gl/fbAQLP - -exports[`context.util # should create article context 1`] = ` -Object { - "categories": Array [ - "Blog", - ], - "cover": "./cover.jpg", - "created": 2019-09-10T00:00:00.000Z, - "html": "

-<h1 id=\\"a-beautiful-day\\">A beautiful day</h1>
-<p>This is an example markdown content with all one-pass test cases.</p>
-<h2 id=\\"getting-started\\">Getting Started</h2>
-<p>Getting started snippet</p>
-<h3 id=\\"installing\\">Installing</h3>
-<pre class=\\"hljs\\"><code>yarn add -D properteis-json-loader
-</code></pre>
-<p>or, using npm</p>
-<pre class=\\"hljs\\"><code>npm install --save-dev properties-json-loader
-</code></pre>
-<h3 id=\\"update-webpack-configuration\\">Update webpack configuration</h3>
-<p>You should use it to load as one of webpack loader configuration matching <code>*.properties</code> file.</p>
-<h2 id=\\"deep-understanding\\">Deep Understanding</h2>
-<h2 id=\\"references\\">References</h2>
-<p><img src=\\"https://img.aquariuslt.com/posts/2019/08/migrating-github-actions.png\\" alt=\\"Absolute Image\\"></p>
-<p><img src=\\"./a-image.jpg\\" alt=\\"Relative Image\\"></p>
-<p><img src=\\"./images/a-image.jpg\\" alt=\\"Relative Image with directory\\"></p>
-", - "id": "introducing-json-properties-loader", - "images": Array [ - "https://img.aquariuslt.com/posts/2019/08/migrating-github-actions.png", - "./a-image.jpg", - "./images/a-image.jpg", - ], - "src": " -# A beautiful day - -This is an example markdown content with all one-pass test cases. - -## Getting Started - -Getting started snippet - -### Installing - -\`\`\`shell script -yarn add -D properteis-json-loader -\`\`\` - -or, using npm - -\`\`\`shell script -npm install --save-dev properties-json-loader -\`\`\` - -### Update webpack configuration - -You should use it to load as one of webpack loader configuration matching \`*.properties\` file. - -## Deep Understanding - -## References - -![Absolute Image](https://img.aquariuslt.com/posts/2019/08/migrating-github-actions.png) - -![Relative Image](./a-image.jpg) - -![Relative Image with directory](./images/a-image.jpg) -", - "summary": "This is an example markdown content with all one-pass test cases.Getting started snippetor, using npmYou should use it to load as one of webpack loader configuration matching *.properties file. 
-", - "tags": Array [ - "NPM", - "Node", - "Typescript", - ], - "title": "Introducing JSON Properties Loader", - "toc": Array [ - Object { - "children": Array [], - "id": "a-beautiful-day", - "label": "A beautiful day", - "level": 1, - "pid": -1, - "position": 0, - }, - Object { - "children": Array [], - "id": "getting-started", - "label": "Getting Started", - "level": 2, - "pid": 0, - "position": 1, - }, - Object { - "children": Array [], - "id": "installing", - "label": "Installing", - "level": 3, - "pid": 1, - "position": 2, - }, - Object { - "children": Array [], - "id": "update-webpack-configuration", - "label": "Update webpack configuration", - "level": 3, - "pid": 1, - "position": 3, - }, - Object { - "children": Array [], - "id": "deep-understanding", - "label": "Deep Understanding", - "level": 2, - "pid": 0, - "position": 4, - }, - Object { - "children": Array [], - "id": "references", - "label": "References", - "level": 2, - "pid": 0, - "position": 5, - }, - ], - "updated": 2019-09-10T00:00:00.000Z, -} -`; diff --git a/packages/article/src/context.util.ts b/packages/article/src/context.util.ts deleted file mode 100644 index 1297d1b2d..000000000 --- a/packages/article/src/context.util.ts +++ /dev/null @@ -1,78 +0,0 @@ -import * as _ from 'lodash'; -import * as fs from 'fs'; -import * as hljs from 'highlight.js'; -import * as uslug from 'uslug'; -import * as MarkdownIt from 'markdown-it'; -import * as AnchorPlugin from 'markdown-it-anchor'; -import { ContentItemPlugin, ImagesDetectionPlugin, metadata, source, SummaryPlugin } from '@blog/markdown'; -import { ArticleContext } from '@blog/common/interfaces/articles/article-context'; - -export const createArticleContext = (filepath: string) => { - const fileContent = fs.readFileSync(filepath).toString(); - const meta = metadata(fileContent); - const src = source(fileContent); - - const md = new MarkdownIt({ - langPrefix: 'hljs ', - highlight: function(str, lang) { - if (lang && hljs.getLanguage(lang)) { - try { 
- return '<pre class="hljs"><code>' + hljs.highlight(lang, str, true).value + '</code></pre>'; - } catch (__) {} - } - return '<pre class="hljs"><code>' + md.utils.escapeHtml(str) + '</code></pre>
'; - } - }) - .use(ImagesDetectionPlugin) - .use(ContentItemPlugin) - .use(SummaryPlugin) - .use(AnchorPlugin, { - slugify: uslug - }); - - const context = Object.assign({}, meta); - const html = md.render(src, context); - - context.src = src; - context.html = html; - - // TODO: add validation - return context; -}; - -/** get all tags as key and contexts under each tag as value */ -export const groupByArticleTags = (contexts: Partial[]) => { - const tagsMap = Object.create({}); - - _.each(contexts, (context) => { - _.each(context.tags, (tag) => { - if (tagsMap.hasOwnProperty(tag)) { - tagsMap[tag].push(context); - } else { - tagsMap[tag] = [context]; - } - }); - }); - - return tagsMap; -}; - -export const groupByArticleCategories = (contexts: Partial[]) => { - const categoriesMap = Object.create({}); - - _.each(contexts, (context) => { - _.each(context.categories, (category) => { - if (categoriesMap.hasOwnProperty(category)) { - categoriesMap[category].push(context); - } else { - categoriesMap[category] = [context]; - } - }); - }); - - return categoriesMap; -}; - -export const getAllTagsFromContexts = (contexts: ArticleContext[]) => _.keys(groupByArticleTags(contexts)); - -export const getAllCategoriesFromContexts = (contexts: ArticleContext[]) => _.keys(groupByArticleCategories(contexts)); diff --git a/packages/article/src/index.ts b/packages/article/src/index.ts deleted file mode 100644 index 0af9affc2..000000000 --- a/packages/article/src/index.ts +++ /dev/null @@ -1,3 +0,0 @@ -export * from './lookup.util'; -export * from './asset.util'; -export * from './context.util'; diff --git a/packages/article/src/lookup.util.ts b/packages/article/src/lookup.util.ts deleted file mode 100644 index a241c16db..000000000 --- a/packages/article/src/lookup.util.ts +++ /dev/null @@ -1,45 +0,0 @@ -import * as _ from 'lodash'; -import * as fs from 'fs'; -import * as glob from 'glob'; -import * as path from 'path'; -import * as url from 'url'; -import * as MarkdownIt from 
'markdown-it'; -import { source, metadata } from '@blog/markdown'; -import { ImagesDetectionPlugin } from '@blog/markdown'; - -/** - * @description provide a scan function to scan all markdown files - * */ -export const lookupMarkdownFiles = (baseDir: string): string[] => { - const MD_RULES = '/**/*.md'; - return glob.sync(baseDir + MD_RULES); -}; - -/** - * @description lookup images files in markdown - * */ -export const lookupImagesInMarkdownFile = (filepath) => { - const sourceText = fs.readFileSync(filepath).toString(); - const meta = metadata(sourceText); - const src = source(sourceText); - const md = new MarkdownIt().use(ImagesDetectionPlugin); - const context = Object.create({}); - md.parse(src, context); - - const contentImages = context.images; - const coverImage = meta.cover; - - return contentImages.concat([coverImage]); -}; - -export const isImageHosting = (imageUrl: string): boolean => { - const MATCHING_PREFIXES = ['https://', 'http://', '//']; - return _.includes( - _.map(MATCHING_PREFIXES, (prefix) => _.startsWith(imageUrl, prefix)), - true - ); -}; - -export const getImageFilename = (imageUrl: string): string => { - return path.basename(url.parse(imageUrl).pathname); -}; diff --git a/packages/article/tsconfig.json b/packages/article/tsconfig.json deleted file mode 100644 index f1a0e1870..000000000 --- a/packages/article/tsconfig.json +++ /dev/null @@ -1,9 +0,0 @@ -{ - "extends": "../tsconfig.lib.json", - "compilerOptions": { - "outDir": "./dist", - "baseUrl": "./" - }, - "include": ["src"], - "exclude": ["node_modules", "dist", "reports", "src/__tests__"] -} diff --git a/packages/common/README.md b/packages/common/README.md deleted file mode 100644 index 8944b0e59..000000000 --- a/packages/common/README.md +++ /dev/null @@ -1,43 +0,0 @@ -# `@blog/common` - -## Articles - -## Routes, Breadcrumbs, Navigation & Layouts - -We divide our pages into 3 main layouts: - -- List -- Detail -- Table - -Below are example of layouts - -### Layout: List - -- 
:domain -- :domain/posts -- :domain/categories/:category -- :domain/tags/:tag - -### Layout: Detail - -- :domain/posts/:year/:month/:day/:post-id -- :domain/pages/:page-id - -### Layout: Table - -- :domain/categories/:category -- :domain/tags/:tag - -Then breadcrumbs as navigation data will look like below: - -| layout | example url | breadcrumbs example | remark | -| ------ | ------------------------------------------------ | ------------------------------ | --------------------- | -| List | https://example.com | `Home` \| null | `Home` is i18n value | -| List | https://example.com/posts | `Home` > `Posts` \| null | `Posts` is i18n value | -| List | https://example.com/categories/blog | `Home` > `Categories` > `Blog` | | -| List | https://example.com/tags/java | `Home` > `Tags` > `Java` | | -| Detail | https://example.com/posts/2019/09/20/article-foo | `Home` > `Posts` > `Article` | | -| Detail | https://example.com/pages/about | `Home` > `Pages` > `About` | | -| Table | https://example/categories | `Home` > `Categories` | | -| Table | https://example/tags | `Home` > `Tags` | | diff --git a/packages/common/package.json b/packages/common/package.json deleted file mode 100644 index 3926d1e28..000000000 --- a/packages/common/package.json +++ /dev/null @@ -1,25 +0,0 @@ -{ - "name": "@blog/common", - "version": "6.26.198", - "description": "common instance and constants", - "author": "Aquariuslt ", - "homepage": "https://github.com/aquariuslt/blog#readme", - "license": "MIT", - "private": true, - "repository": { - "type": "git", - "url": "git+https://github.com/aquariuslt/blog.git" - }, - "dependencies": { - "fancy-log": "1.3.3", - "lodash": "4.17.21", - "rimraf": "3.0.2", - "schema-dts": "0.8.2" - }, - "devDependencies": { - "@types/fancy-log": "1.3.1", - "@types/jest": "26.0.20", - "@types/node": "13.13.45", - "typescript": "4.0.5" - } -} diff --git a/packages/common/interfaces/api/index.ts b/packages/common/src/interfaces/api/index.ts similarity index 100% rename 
from packages/common/interfaces/api/index.ts rename to packages/common/src/interfaces/api/index.ts diff --git a/packages/common/interfaces/articles/article-context.ts b/packages/common/src/interfaces/articles/article-context.ts similarity index 100% rename from packages/common/interfaces/articles/article-context.ts rename to packages/common/src/interfaces/articles/article-context.ts diff --git a/packages/common/interfaces/articles/article-metadata.ts b/packages/common/src/interfaces/articles/article-metadata.ts similarity index 100% rename from packages/common/interfaces/articles/article-metadata.ts rename to packages/common/src/interfaces/articles/article-metadata.ts diff --git a/packages/common/interfaces/articles/content-item.ts b/packages/common/src/interfaces/articles/content-item.ts similarity index 100% rename from packages/common/interfaces/articles/content-item.ts rename to packages/common/src/interfaces/articles/content-item.ts diff --git a/packages/common/interfaces/articles/index.ts b/packages/common/src/interfaces/articles/index.ts similarity index 100% rename from packages/common/interfaces/articles/index.ts rename to packages/common/src/interfaces/articles/index.ts diff --git a/packages/common/interfaces/navigation/index.ts b/packages/common/src/interfaces/navigation/index.ts similarity index 100% rename from packages/common/interfaces/navigation/index.ts rename to packages/common/src/interfaces/navigation/index.ts diff --git a/packages/common/interfaces/profile/index.ts b/packages/common/src/interfaces/profile/index.ts similarity index 92% rename from packages/common/interfaces/profile/index.ts rename to packages/common/src/interfaces/profile/index.ts index a4d1f6617..82b55ab0f 100644 --- a/packages/common/interfaces/profile/index.ts +++ b/packages/common/src/interfaces/profile/index.ts @@ -29,8 +29,8 @@ export const EmptyProfile: Profile = { email: '', logo: { '@type': 'ImageObject', - url: '', + url: '' }, name: '', - url: new Map(), + url: new 
Map() }; diff --git a/packages/common/interfaces/routes/breadcrumb.ts b/packages/common/src/interfaces/routes/breadcrumb.ts similarity index 100% rename from packages/common/interfaces/routes/breadcrumb.ts rename to packages/common/src/interfaces/routes/breadcrumb.ts diff --git a/packages/common/interfaces/routes/index.ts b/packages/common/src/interfaces/routes/index.ts similarity index 91% rename from packages/common/interfaces/routes/index.ts rename to packages/common/src/interfaces/routes/index.ts index 6cb989132..e5d1c6ff0 100644 --- a/packages/common/interfaces/routes/index.ts +++ b/packages/common/src/interfaces/routes/index.ts @@ -8,7 +8,7 @@ export const RoutesPathRegex = { TAGS: '/tags', TAG_DETAIL: '/tags/:id', CATEGORIES: '/categories', - CATEGORY_DETAIL: '/categories/:id', + CATEGORY_DETAIL: '/categories/:id' }; export enum RoutePathPrefix { @@ -19,7 +19,7 @@ export enum RoutePathPrefix { POSTS = 'posts', PAGES = 'pages', NAVIGATION = 'navigation', - PROFILE = 'profile', + PROFILE = 'profile' } export interface RouteMeta { @@ -54,7 +54,7 @@ export const EmptyRouteMeta: RouteMeta = { path: '', title: '', type: '', - url: '', + url: '' }; export interface Meta { @@ -79,10 +79,10 @@ export enum MetaName { OPEN_GRAPH_SITE_NAME = 'og:site_name', GOOGLE_SITE_VERIFICATION = 'google-site-verification', - GOOGLE_SITE_TRACKING = 'google-analytics', + GOOGLE_SITE_TRACKING = 'google-analytics' } export enum MetaValue { WEBSITE = 'website', - ARTICLE = 'article', + ARTICLE = 'article' } diff --git a/packages/common/interfaces/routes/layout.ts b/packages/common/src/interfaces/routes/layout.ts similarity index 76% rename from packages/common/interfaces/routes/layout.ts rename to packages/common/src/interfaces/routes/layout.ts index 72f125d7b..44389fe6c 100644 --- a/packages/common/interfaces/routes/layout.ts +++ b/packages/common/src/interfaces/routes/layout.ts @@ -1,5 +1,5 @@ export enum Layout { LIST = 'LIST', DETAIL = 'DETAIL', - TABLE = 'TABLE', + TABLE = 'TABLE' 
} diff --git a/packages/common/utils/path.util.ts b/packages/common/src/utils/path.util.ts similarity index 100% rename from packages/common/utils/path.util.ts rename to packages/common/src/utils/path.util.ts diff --git a/packages/common/tsconfig.json b/packages/common/tsconfig.json deleted file mode 100644 index f92bcafbc..000000000 --- a/packages/common/tsconfig.json +++ /dev/null @@ -1,10 +0,0 @@ -{ - "extends": "../tsconfig.lib.json", - "compilerOptions": { - "outDir": "./dist", - "baseUrl": "./", - "target": "es5" - }, - "include": ["interfaces", "utils"], - "exclude": ["node_modules"] -} diff --git a/packages/config/package.json b/packages/config/package.json deleted file mode 100644 index f58657078..000000000 --- a/packages/config/package.json +++ /dev/null @@ -1,55 +0,0 @@ -{ - "name": "@blog/config", - "version": "6.26.198", - "description": "configuration module", - "author": "aquariuslt ", - "homepage": "https://github.com/aquariuslt/blog#readme", - "license": "MIT", - "private": true, - "main": "dist/index.js", - "types": "dist/index.d.ts", - "files": [ - "dist" - ], - "scripts": { - "clean": "rimraf dist", - "test": "jest", - "build:lib": "tsc" - }, - "dependencies": { - "cosmiconfig": "7.0.0", - "dotenv": "8.2.0", - "lodash": "4.17.21" - }, - "devDependencies": { - "@types/dotenv": "8.2.0", - "@types/fancy-log": "1.3.1", - "@types/jest": "26.0.20", - "@types/lodash": "4.14.168", - "@types/node": "12.20.4", - "jest": "26.6.3", - "jest-raw-loader": "1.0.1", - "ts-jest": "26.5.3", - "typescript": "4.0.5" - }, - "jest": { - "moduleFileExtensions": [ - "ts", - "js", - "json" - ], - "transform": { - "^.+\\.ts$": "ts-jest", - "^.*\\.md$": "jest-raw-loader" - }, - "collectCoverageFrom": [ - "!**/__tests__/**", - "<rootDir>/src/**/*.ts" - ], - "testMatch": [ - "<rootDir>/src/**/*.test.ts" - ], - "testEnvironment": "node", - "coverageDirectory": "<rootDir>/coverage" - } -} diff --git a/packages/config/src/index.ts b/packages/config/src/index.ts deleted file mode 100644 index
4e9ba5e76..000000000 --- a/packages/config/src/index.ts +++ /dev/null @@ -1,93 +0,0 @@ -import * as path from 'path'; -import { cosmiconfigSync } from 'cosmiconfig'; - -export interface Config { - /** meta */ - location: string; - version: string; - basePath: string; - - /** content */ - sources: { - pages: string; - posts: string; - }; - - dirs: { - dest: string; - api: string; - posts: string; - pages: string; - }; - - site: { - domain: string; - baseUrl: string; - baseTitle: string; - disqus: string; // TODO: refactor as comment/feature? - googleAnalytics?: { - verification: string; - tracking: string; - }; - }; - profile; - pageOptions: { - titleSeparator: string; - }; - theme: string; -} - -/** - * @description detect theme package location by name - * @example detectThemePackage(`theme-vue`) => BASE_PATH + `/themes/theme-vue` - **/ -export const detectThemePackage = (theme: string) => { - return `/themes/${theme}`; -}; - -export const loadConfig = (id = `blog`, lookupPath?: string): Config => { - const explorer = cosmiconfigSync(id); - const configLookupResult = explorer.search(lookupPath); - const configLocation = configLookupResult.filepath; - const config = configLookupResult.config; - const basePath = path.dirname(configLocation); - - const site = config['site']; - const protocol = site.https ? 
'https://' : 'http://'; - const baseUrl = protocol + site.domain; - - const pageOptions = config['pages']; - - return { - location: configLocation, - version: config['version'], - basePath: basePath, - sources: { - posts: path.join(basePath, config.sources.posts), - pages: path.join(basePath, config.sources.pages) - }, - dirs: { - dest: path.join(basePath, config.dirs.dest), - api: path.join(basePath, config.dirs.api), - posts: path.join(basePath, config.dirs.posts), - pages: path.join(basePath, config.dirs.pages) - }, - site: { - baseTitle: site.title, - baseUrl: baseUrl, - disqus: site.disqus, - domain: site.domain, - googleAnalytics: config.site['google_analytics'] - ? { - verification: config.site['google_analytics']['verification'], - tracking: config.site['google_analytics']['tracking'] - } - : undefined - }, - profile: config.profile, - theme: path.join(basePath, detectThemePackage(config.theme)), - pageOptions: { - titleSeparator: pageOptions['title_separator'] - } - }; -}; diff --git a/packages/config/tsconfig.json b/packages/config/tsconfig.json deleted file mode 100644 index f1a0e1870..000000000 --- a/packages/config/tsconfig.json +++ /dev/null @@ -1,9 +0,0 @@ -{ - "extends": "../tsconfig.lib.json", - "compilerOptions": { - "outDir": "./dist", - "baseUrl": "./" - }, - "include": ["src"], - "exclude": ["node_modules", "dist", "reports", "src/__tests__"] -} diff --git a/packages/markdown/README.md b/packages/markdown/README.md deleted file mode 100644 index 4995d135c..000000000 --- a/packages/markdown/README.md +++ /dev/null @@ -1,38 +0,0 @@ -# `@blog/markdown` - -Including a set of markdown processor base on `markdown-it` and `gray-matter` - -## Conduct - -The target is to read and parse data from markdown file as article. 
So the input params is a markdown file (file content / the absolute path with filename), and the output should be parsing data, including: - -- Metadata: describe the metadata of this article, like title, created, updated, category, tags, permalink, cover images... -- Source Text: the markdown source content without `YFM` -- Html Content: the rendered html content (render after markdown plugin process chains) -- Medias: the medias in article, images (with srcset urls), videos -- Summary: the summary of article in shortcut - -## Metadata Definition - -In `v5.x`, I've built a process flow with markdown chains. - -In `v6.x`, the target is to process article with more standard way. - -There are some google recommended standard formatting guides, which inspired me: - -- Matching `structued-data` requirement by [Google Data Types Definitions](https://developers.google.com/search/docs/data-types/article), -- Matching `amp-story` by [AMP Project](https://amp.dev/documentation/components/amp-story/?referrer=ampproject.org) - -### Metadata Changes - -> Metadata type add/update compare to `v5.x` - -#### Updated Metadata - -- Update `caregroy: string` to `categories: string[]`, this behavior will show like [Front Matter - Hexo](https://hexo.io/zh-cn/docs/front-matter) - -#### New Metadata - -- Add `id: string` instead of filename detecting (to simplify markdown processing chain, no need to add filename detection) -- Add `description: string` as html metas -- Add optional attribute `layout`, we can maintain a different layout for each page (like should show comments) diff --git a/packages/markdown/package.json b/packages/markdown/package.json deleted file mode 100644 index dc7373248..000000000 --- a/packages/markdown/package.json +++ /dev/null @@ -1,64 +0,0 @@ -{ - "name": "@blog/markdown", - "version": "6.26.198", - "description": "markdown utils for parsing blog articles", - "author": "Aquariuslt ", - "homepage": "https://github.com/aquariuslt/blog#readme", - "license": 
"MIT", - "private": true, - "repository": { - "type": "git", - "url": "git+https://github.com/aquariuslt/blog.git" - }, - "main": "dist/index.js", - "types": "dist/index.d.ts", - "files": [ - "dist" - ], - "scripts": { - "clean": "rimraf dist", - "test": "jest", - "build:lib": "tsc" - }, - "dependencies": { - "@blog/common": "^6.26.198", - "cheerio": "1.0.0-rc.5", - "gray-matter": "4.0.2", - "lodash": "4.17.21", - "markdown-it": "12.0.3", - "markdown-it-anchor": "6.0.1", - "uslug": "1.0.4" - }, - "devDependencies": { - "@types/cheerio": "0.22.24", - "@types/jest": "26.0.20", - "@types/lodash": "4.14.168", - "@types/markdown-it": "0.0.9", - "@types/node": "13.13.45", - "jest": "26.6.3", - "jest-raw-loader": "1.0.1", - "rimraf": "3.0.2", - "ts-jest": "26.5.3", - "typescript": "4.0.5" - }, - "jest": { - "moduleFileExtensions": [ - "ts", - "js", - "json" - ], - "transform": { - "^.+\\.ts$": "ts-jest", - "^.*\\.md$": "jest-raw-loader" - }, - "collectCoverageFrom": [ - "!**/__tests__/**", - "/src/**/*.ts" - ], - "testMatch": [ - "/src/**/*.test.ts" - ], - "testEnvironment": "node", - "coverageDirectory": "/coverage" - } -} diff --git a/packages/markdown/src/__tests__/__fixtures__/sample-article.md b/packages/markdown/src/__tests__/__fixtures__/sample-article.md deleted file mode 100644 index 372d3703c..000000000 --- a/packages/markdown/src/__tests__/__fixtures__/sample-article.md +++ /dev/null @@ -1,45 +0,0 @@ ---- -title: 'Introducing JSON Properties Loader' -created: 2019-09-10 -updated: 2019-09-10 -permalink: introducing-json-properties-loader -categories: - - Blog -tags: - - NPM - - Node - - Typescript ---- - -# A beautiful day - -This is an example markdown content with all one-pass test cases. - -This section should load as summary in parsed data. 
- -## Getting Started - -Getting started snippet - -### Installing - -```shell script -yarn add -D properteis-json-loader -``` - -or, using npm - -```shell script -npm install --save-dev properties-json-loader -``` - -### Update webpack configuration - -You should use it to load as one of webpack loader configuration matching `*.properties` file. - -## Deep Understanding - -## References - -![Absolute Image](https://img.aquariuslt.com/posts/2019/08/migrating-github-actions.png) -![Relative Image](./images/sample-image.png) diff --git a/packages/markdown/src/__tests__/__snapshots__/pre-process.test.ts.snap b/packages/markdown/src/__tests__/__snapshots__/pre-process.test.ts.snap deleted file mode 100644 index ef60a9dd8..000000000 --- a/packages/markdown/src/__tests__/__snapshots__/pre-process.test.ts.snap +++ /dev/null @@ -1,55 +0,0 @@ -// Jest Snapshot v1, https://goo.gl/fbAQLP - -exports[`markdown: preprocess # should read file content in pre-process steps 1`] = ` -Object { - "categories": Array [ - "Blog", - ], - "created": 2019-09-10T00:00:00.000Z, - "permalink": "introducing-json-properties-loader", - "tags": Array [ - "NPM", - "Node", - "Typescript", - ], - "title": "Introducing JSON Properties Loader", - "updated": 2019-09-10T00:00:00.000Z, -} -`; - -exports[`markdown: preprocess # should read file content in pre-process steps 2`] = ` -" -# A beautiful day - -This is an example markdown content with all one-pass test cases. - -This section should load as summary in parsed data. - -## Getting Started - -Getting started snippet - -### Installing - -\`\`\`shell script -yarn add -D properteis-json-loader -\`\`\` - -or, using npm - -\`\`\`shell script -npm install --save-dev properties-json-loader -\`\`\` - -### Update webpack configuration - -You should use it to load as one of webpack loader configuration matching \`*.properties\` file. 
- -## Deep Understanding - -## References - -![Absolute Image](https://img.aquariuslt.com/posts/2019/08/migrating-github-actions.png) -![Relative Image](./images/sample-image.png) -" -`; diff --git a/packages/markdown/src/__tests__/__snapshots__/summary.plugin.test.ts.snap b/packages/markdown/src/__tests__/__snapshots__/summary.plugin.test.ts.snap deleted file mode 100644 index c6ae1faf6..000000000 --- a/packages/markdown/src/__tests__/__snapshots__/summary.plugin.test.ts.snap +++ /dev/null @@ -1,6 +0,0 @@ -// Jest Snapshot v1, https://goo.gl/fbAQLP - -exports[`markdown-it plugin: summary # should get summary with more than 120 chars without any options 1`] = ` -"This is an example markdown content with all one-pass test cases.This section should load as summary in parsed data.Getting started snippet -" -`; diff --git a/packages/markdown/src/__tests__/content-item.plugin.test.ts b/packages/markdown/src/__tests__/content-item.plugin.test.ts deleted file mode 100644 index 664bb1d48..000000000 --- a/packages/markdown/src/__tests__/content-item.plugin.test.ts +++ /dev/null @@ -1,34 +0,0 @@ -import * as MarkdownIt from 'markdown-it'; -import { ContentItemPlugin } from '../index'; - -import { read } from '../__tests__/test.fixtures.helper'; - -describe('markdown-it plugin: content items', (): void => { - it('# should get toc data at env', (): void => { - const md = MarkdownIt().use(ContentItemPlugin); - const raw = read(`sample-article.md`); - const context = {}; - - md.parse(raw, context); - expect(context).toHaveProperty('toc'); - }); - - it('# should have no toc when no context', (): void => { - const md = MarkdownIt().use(ContentItemPlugin); - const raw = read(`sample-article.md`); - md.parse(raw, undefined); - }); - - it('# should detect level 2 contents', (): void => { - const md = MarkdownIt().use(ContentItemPlugin); - const raw = read(`sample-article.md`); - const context = {}; - - md.parse(raw, context); - - const toc = context['toc']; - - 
expect(toc[0]).toHaveProperty('children'); - expect(toc[0].children).toHaveLength(0); - }); -}); diff --git a/packages/markdown/src/__tests__/images.plugin.test.ts b/packages/markdown/src/__tests__/images.plugin.test.ts deleted file mode 100644 index 076a39b60..000000000 --- a/packages/markdown/src/__tests__/images.plugin.test.ts +++ /dev/null @@ -1,22 +0,0 @@ -import * as MarkdownIt from 'markdown-it'; -import { ImagesDetectionPlugin } from '../index'; -import { read } from '../__tests__/test.fixtures.helper'; - -describe('markdown-it plugins: images', (): void => { - it('# should detect images at env', (): void => { - const md = MarkdownIt().use(ImagesDetectionPlugin); - const raw = read(`sample-article.md`); - const context = {}; - md.parse(raw, context); - - expect(context).toHaveProperty('images'); - expect(context['images'].length).toBe(2); - expect(context['images']).toMatchSnapshot(); - }); - - it('# should detect no images when no env', (): void => { - const md = MarkdownIt().use(ImagesDetectionPlugin); - const raw = read(`sample-article.md`); - md.parse(raw, undefined); - }); -}); diff --git a/packages/markdown/src/__tests__/pre-process.test.ts b/packages/markdown/src/__tests__/pre-process.test.ts deleted file mode 100644 index 2ecef26cc..000000000 --- a/packages/markdown/src/__tests__/pre-process.test.ts +++ /dev/null @@ -1,18 +0,0 @@ -import { read } from '../__tests__/test.fixtures.helper'; -import { metadata, source } from '../index'; - -describe('markdown: preprocess', () => { - it('# should read file content in pre-process steps', () => { - const raw = read(`sample-article.md`); - const meta = metadata(raw); - const src = source(raw); - - expect(meta).not.toBeUndefined(); - expect(meta).toHaveProperty('title'); - expect(meta).toHaveProperty('created'); - expect(src).not.toBeUndefined(); - - expect(meta).toMatchSnapshot(); - expect(src).toMatchSnapshot(); - }); -}); diff --git a/packages/markdown/src/__tests__/summary.plugin.test.ts 
b/packages/markdown/src/__tests__/summary.plugin.test.ts deleted file mode 100644 index 3843d5e07..000000000 --- a/packages/markdown/src/__tests__/summary.plugin.test.ts +++ /dev/null @@ -1,36 +0,0 @@ -import * as MarkdownIt from 'markdown-it'; -import { SummaryPlugin } from '../summary.plugin'; -import { source } from '../index'; -import { read } from '../__tests__/test.fixtures.helper'; - -describe('markdown-it plugin: summary', () => { - it('# should get summary with more than 120 chars without any options', () => { - const md = MarkdownIt().use(SummaryPlugin); - const raw = source(read(`sample-article.md`)); - const context = {}; - - md.parse(raw, context); - expect(context).toHaveProperty('summary'); - expect(context['summary'].length).toBeGreaterThanOrEqual(120); - expect(context['summary']).toMatchSnapshot(); - }); - - it('# should get summary with spec chars with options.len', () => { - const options = { - len: 50 - }; - const md = MarkdownIt().use(SummaryPlugin, options); - const raw = source(read(`sample-article.md`)); - const context = {}; - - md.parse(raw, context); - expect(context).toHaveProperty('summary'); - expect(context['summary'].length).toBeGreaterThanOrEqual(options.len); - }); - - it('# should get no summary when no context', () => { - const md = MarkdownIt().use(SummaryPlugin); - const raw = source(read(`sample-article.md`)); - md.parse(raw, undefined); - }); -}); diff --git a/packages/markdown/src/content-item.plugin.ts b/packages/markdown/src/content-item.plugin.ts deleted file mode 100644 index 71e2625b2..000000000 --- a/packages/markdown/src/content-item.plugin.ts +++ /dev/null @@ -1,69 +0,0 @@ -import * as uslug from 'uslug'; -import * as MarkdownIt from 'markdown-it'; -import { ContentItem } from '@blog/common/interfaces/articles/content-item'; - -const level = (tag: string) => Number(tag.slice(1)); - -/** - * @desc set pid for headingItems - * @param headingItems: origin heading data - */ -const collapseHeading = (headingItems: 
ContentItem[]): ContentItem[] => { - const ROOT_PID = -1; - - // 1. fill pid. - headingItems.map((headingItem, index): void => { - if (headingItem.level === 1) { - headingItem.pid = ROOT_PID; - } else if (index !== 0 && headingItem.level < headingItems[index - 1].level) { - for (let i = index - 1; i > 0; i--) { - if (headingItem.level === headingItems[i].level) { - headingItem.pid = headingItems[i].pid; - break; - } - } - } else if (index !== 0 && headingItem.level > headingItems[index - 1].level) { - headingItem.pid = headingItems[index - 1].position; - } else { - headingItem.pid = headingItems[index - 1].pid; - } - }); - - return headingItems; -}; - -export const ContentItemPlugin = (md: MarkdownIt) => { - let shadowState; - - md.core.ruler.push('shadow_state', (state) => { - shadowState = state; - }); - - md.core.ruler.after('shadow_state', 'content_item', (state) => { - const shadowTokens = shadowState.tokens; - - let headingItems: ContentItem[] = []; - let headingPosition = 0; - shadowTokens.map((token, index): void => { - if (token.type === 'heading_close') { - const headingContent = shadowTokens[index - 1].content; - const headingLevel = level(token.tag); - const headingId = uslug(headingContent); - - headingItems.push({ - label: headingContent, - level: headingLevel, - id: headingId, - position: headingPosition++, - children: [] - }); - } - }); - - headingItems = collapseHeading(headingItems); - - if (state.env) { - state.env.toc = headingItems; - } - }); -}; diff --git a/packages/markdown/src/images.plugin.ts b/packages/markdown/src/images.plugin.ts deleted file mode 100644 index be2bc17ef..000000000 --- a/packages/markdown/src/images.plugin.ts +++ /dev/null @@ -1,48 +0,0 @@ -import * as _ from 'lodash'; -import * as MarkdownIt from 'markdown-it'; - -const PNG_EXTENSION = '.png'; -const WEBP_EXTENSION = '.webp'; - -const LAZY_IMAGE_PLACEHOLDER = - 
'data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAABkAAAAJWAQMAAAA6AtlxAAAAA1BMVEX///+nxBvIAAAAi0lEQVR42u3BAQ0AAADCoPdPbQ8HFAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA/wbVlQABttAC+QAAAABJRU5ErkJggg=='; - -export const ImagesDetectionPlugin = (md: MarkdownIt) => { - md.core.ruler.push('detect_images', (state): void => { - const tokens = state.tokens; - const images: string[] = []; - - tokens.map((token): void => { - if (token.type === 'inline') { - token.children.map((childToken): void => { - if (childToken.type === 'image') { - childToken.attrs.map((imageAttr): void => { - if (_.isArray(imageAttr) && imageAttr.length > 1 && imageAttr[0] === 'src') { - const imageUrl = imageAttr[1]; - images.push(imageUrl); - } - }); - } - }); - } - }); - - if (state.env) { - state.env.images = images; - } - }); - - md.renderer.rules.image = (tokens, idx, options, env, self) => { - const token = tokens[idx]; - const src = token.attrGet('src'); - const alt = token.attrGet('alt'); - const optimizedWebpSrc = _.replace(src, PNG_EXTENSION, WEBP_EXTENSION); - - return ` - - - ${alt} - - `; - }; -}; diff --git a/packages/markdown/src/index.ts b/packages/markdown/src/index.ts deleted file mode 100644 index 71c1b9cae..000000000 --- a/packages/markdown/src/index.ts +++ /dev/null @@ -1,4 +0,0 @@ -export * from './metadata'; -export * from './images.plugin'; -export * from './summary.plugin'; -export * from './content-item.plugin'; diff --git a/packages/markdown/src/summary.plugin.ts b/packages/markdown/src/summary.plugin.ts deleted file mode 100644 index 1823bf637..000000000 --- a/packages/markdown/src/summary.plugin.ts +++ /dev/null @@ -1,38 +0,0 @@ -import * as MarkdownIt from 'markdown-it'; -import * as cheerio from 'cheerio'; - -const DEFAULT_SUMMARY_LENGTH = 120; - -export const SummaryPlugin = (md: MarkdownIt, options?) 
=> { - md.core.ruler.push('detect_summary', (state): void => { - let summarySourceText = ''; - let summaryLength = DEFAULT_SUMMARY_LENGTH; - - if (options && options.len) { - summaryLength = options.len; - } - - const tokens = state.tokens; - - tokens.forEach((token, index): void => { - if ( - index > 0 && - token.type === 'inline' && - tokens[index - 1].type === 'paragraph_open' && - token.content.charAt(0) != '#' - ) { - if (summarySourceText.length < summaryLength) { - summarySourceText += token.content; - } - } - }); - - const summaryHtml = new MarkdownIt().render(summarySourceText); - const $ = cheerio.load(summaryHtml); - const summary = $.root().text(); - - if (state.env) { - state.env.summary = summary; - } - }); -}; diff --git a/packages/markdown/tsconfig.json b/packages/markdown/tsconfig.json deleted file mode 100644 index f1a0e1870..000000000 --- a/packages/markdown/tsconfig.json +++ /dev/null @@ -1,9 +0,0 @@ -{ - "extends": "../tsconfig.lib.json", - "compilerOptions": { - "outDir": "./dist", - "baseUrl": "./" - }, - "include": ["src"], - "exclude": ["node_modules", "dist", "reports", "src/__tests__"] -} diff --git a/packages/pwa/package.json b/packages/pwa/package.json deleted file mode 100644 index f7dffe4c6..000000000 --- a/packages/pwa/package.json +++ /dev/null @@ -1,67 +0,0 @@ -{ - "name": "@blog/pwa", - "version": "6.26.198", - "description": "progressive web application tools including workbox + precaching", - "keywords": [ - "pwa", - "workbox", - "prerender" - ], - "author": "aquariuslt ", - "homepage": "https://github.com/aquariuslt/blog#readme", - "license": "MIT", - "main": "dist/index.js", - "types": "dist/index.d.ts", - "repository": { - "type": "git", - "url": "git+https://github.com/aquariuslt/blog.git" - }, - "private": true, - "files": [ - "dist" - ], - "scripts": { - "clean": "rimraf dist", - "test": "jest", - "build:lib": "tsc", - "patch": "patch-package", - "postinstall": "patch-package" - }, - "dependencies": { - "@blog/common": 
"^6.26.198", - "lodash": "4.17.21", - "patch-package": "6.4.5", - "postinstall-postinstall": "2.1.0", - "workbox-build": "4.3.1" - }, - "devDependencies": { - "@types/jest": "26.0.20", - "@types/lodash": "4.14.168", - "@types/node": "13.13.45", - "jest": "26.6.3", - "jest-raw-loader": "1.0.1", - "rimraf": "3.0.2", - "ts-jest": "26.5.3", - "typescript": "4.0.5" - }, - "jest": { - "moduleFileExtensions": [ - "ts", - "js", - "json" - ], - "transform": { - "^.+\\.ts$": "ts-jest", - "^.*\\.md$": "jest-raw-loader" - }, - "collectCoverageFrom": [ - "!**/__tests__/**", - "/src/**/*.ts" - ], - "testMatch": [ - "/src/**/*.test.ts" - ], - "testEnvironment": "node", - "coverageDirectory": "/coverage" - } -} diff --git a/packages/pwa/patches/workbox-precaching+4.3.1.patch b/packages/pwa/patches/workbox-precaching+4.3.1.patch deleted file mode 100644 index 3ab369967..000000000 --- a/packages/pwa/patches/workbox-precaching+4.3.1.patch +++ /dev/null @@ -1,69 +0,0 @@ -diff --git a/node_modules/workbox-precaching/PrecacheController.mjs b/node_modules/workbox-precaching/PrecacheController.mjs -index 2def75d..1ce85b2 100644 ---- a/node_modules/workbox-precaching/PrecacheController.mjs -+++ b/node_modules/workbox-precaching/PrecacheController.mjs -@@ -169,6 +169,7 @@ class PrecacheController { - event, - plugins, - request, -+ fetchOptions: { importance: 'low'}, - }); - - // Allow developers to override the default logic about what is and isn't -diff --git a/node_modules/workbox-precaching/build/workbox-precaching.dev.js b/node_modules/workbox-precaching/build/workbox-precaching.dev.js -index fb69211..f8b5cc0 100644 ---- a/node_modules/workbox-precaching/build/workbox-precaching.dev.js -+++ b/node_modules/workbox-precaching/build/workbox-precaching.dev.js -@@ -418,7 +418,10 @@ this.workbox.precaching = (function (exports, assert_mjs, cacheNames_mjs, getFri - let response = await fetchWrapper_mjs.fetchWrapper.fetch({ - event, - plugins, -- request -+ request, -+ fetchOptions: { -+ 
importance: 'low' -+ } - }); // Allow developers to override the default logic about what is and isn't - // valid by passing in a plugin implementing cacheWillUpdate(), e.g. - // a workbox.cacheableResponse.Plugin instance. -@@ -975,13 +978,13 @@ this.workbox.precaching = (function (exports, assert_mjs, cacheNames_mjs, getFri - assert_mjs.assert.isSWEnv('workbox-precaching'); - } - -+ exports.PrecacheController = PrecacheController; - exports.addPlugins = addPlugins; - exports.addRoute = addRoute; - exports.cleanupOutdatedCaches = cleanupOutdatedCaches; - exports.getCacheKeyForURL = getCacheKeyForURL$1; - exports.precache = precache; - exports.precacheAndRoute = precacheAndRoute; -- exports.PrecacheController = PrecacheController; - - return exports; - -diff --git a/node_modules/workbox-precaching/build/workbox-precaching.dev.js.map b/node_modules/workbox-precaching/build/workbox-precaching.dev.js.map -index 364f81a..bdb4c65 100644 ---- a/node_modules/workbox-precaching/build/workbox-precaching.dev.js.map -+++ b/node_modules/workbox-precaching/build/workbox-precaching.dev.js.map -@@ -1 +1 @@ --{"version":3,"file":"workbox-precaching.dev.js","sources":["../_version.mjs","../utils/precachePlugins.mjs","../addPlugins.mjs","../utils/cleanRedirect.mjs","../utils/createCacheKey.mjs","../utils/printCleanupDetails.mjs","../utils/printInstallDetails.mjs","../PrecacheController.mjs","../utils/getOrCreatePrecacheController.mjs","../utils/removeIgnoredSearchParams.mjs","../utils/generateURLVariations.mjs","../utils/getCacheKeyForURL.mjs","../utils/addFetchListener.mjs","../addRoute.mjs","../utils/deleteOutdatedCaches.mjs","../cleanupOutdatedCaches.mjs","../getCacheKeyForURL.mjs","../precache.mjs","../precacheAndRoute.mjs","../index.mjs"],"sourcesContent":["try{self['workbox:precaching:4.3.1']&&_()}catch(e){}// eslint-disable-line","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n 
https://opensource.org/licenses/MIT.\n*/\n\nimport '../_version.mjs';\n\n\nconst plugins = [];\n\nexport const precachePlugins = {\n /*\n * @return {Array}\n * @private\n */\n get() {\n return plugins;\n },\n\n /*\n * @param {Array} newPlugins\n * @private\n */\n add(newPlugins) {\n plugins.push(...newPlugins);\n },\n};\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {precachePlugins} from './utils/precachePlugins.mjs';\nimport './_version.mjs';\n\n\n/**\n * Adds plugins to precaching.\n *\n * @param {Array} newPlugins\n *\n * @alias workbox.precaching.addPlugins\n */\nconst addPlugins = (newPlugins) => {\n precachePlugins.add(newPlugins);\n};\n\nexport {addPlugins};\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport '../_version.mjs';\n\n/**\n * @param {Response} response\n * @return {Response}\n *\n * @private\n * @memberof module:workbox-precaching\n */\nexport async function cleanRedirect(response) {\n const clonedResponse = response.clone();\n\n // Not all browsers support the Response.body stream, so fall back\n // to reading the entire body into memory as a blob.\n const bodyPromise = 'body' in clonedResponse ?\n Promise.resolve(clonedResponse.body) :\n clonedResponse.blob();\n\n const body = await bodyPromise;\n\n // new Response() is happy when passed either a stream or a Blob.\n return new Response(body, {\n headers: clonedResponse.headers,\n status: clonedResponse.status,\n statusText: clonedResponse.statusText,\n });\n}\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {WorkboxError} from 
'workbox-core/_private/WorkboxError.mjs';\n\nimport '../_version.mjs';\n\n// Name of the search parameter used to store revision info.\nconst REVISION_SEARCH_PARAM = '__WB_REVISION__';\n\n/**\n * Converts a manifest entry into a versioned URL suitable for precaching.\n *\n * @param {Object} entry\n * @return {string} A URL with versioning info.\n *\n * @private\n * @memberof module:workbox-precaching\n */\nexport function createCacheKey(entry) {\n if (!entry) {\n throw new WorkboxError('add-to-cache-list-unexpected-type', {entry});\n }\n\n // If a precache manifest entry is a string, it's assumed to be a versioned\n // URL, like '/app.abcd1234.js'. Return as-is.\n if (typeof entry === 'string') {\n const urlObject = new URL(entry, location);\n return {\n cacheKey: urlObject.href,\n url: urlObject.href,\n };\n }\n\n const {revision, url} = entry;\n if (!url) {\n throw new WorkboxError('add-to-cache-list-unexpected-type', {entry});\n }\n\n // If there's just a URL and no revision, then it's also assumed to be a\n // versioned URL.\n if (!revision) {\n const urlObject = new URL(url, location);\n return {\n cacheKey: urlObject.href,\n url: urlObject.href,\n };\n }\n\n // Otherwise, construct a properly versioned URL using the custom Workbox\n // search parameter along with the revision info.\n const originalURL = new URL(url, location);\n const cacheKeyURL = new URL(url, location);\n cacheKeyURL.searchParams.set(REVISION_SEARCH_PARAM, revision);\n return {\n cacheKey: cacheKeyURL.href,\n url: originalURL.href,\n };\n}\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {logger} from 'workbox-core/_private/logger.mjs';\n\nimport '../_version.mjs';\n\nconst logGroup = (groupTitle, deletedURLs) => {\n logger.groupCollapsed(groupTitle);\n\n for (const url of deletedURLs) {\n logger.log(url);\n }\n\n 
logger.groupEnd();\n};\n\n/**\n * @param {Array} deletedURLs\n *\n * @private\n * @memberof module:workbox-precaching\n */\nexport function printCleanupDetails(deletedURLs) {\n const deletionCount = deletedURLs.length;\n if (deletionCount > 0) {\n logger.groupCollapsed(`During precaching cleanup, ` +\n `${deletionCount} cached ` +\n `request${deletionCount === 1 ? ' was' : 's were'} deleted.`);\n logGroup('Deleted Cache Requests', deletedURLs);\n logger.groupEnd();\n }\n}\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {logger} from 'workbox-core/_private/logger.mjs';\n\nimport '../_version.mjs';\n\n/**\n * @param {string} groupTitle\n * @param {Array} urls\n *\n * @private\n */\nfunction _nestedGroup(groupTitle, urls) {\n if (urls.length === 0) {\n return;\n }\n\n logger.groupCollapsed(groupTitle);\n\n for (const url of urls) {\n logger.log(url);\n }\n\n logger.groupEnd();\n}\n\n/**\n * @param {Array} urlsToPrecache\n * @param {Array} urlsAlreadyPrecached\n *\n * @private\n * @memberof module:workbox-precaching\n */\nexport function printInstallDetails(urlsToPrecache, urlsAlreadyPrecached) {\n const precachedCount = urlsToPrecache.length;\n const alreadyPrecachedCount = urlsAlreadyPrecached.length;\n\n if (precachedCount || alreadyPrecachedCount) {\n let message =\n `Precaching ${precachedCount} file${precachedCount === 1 ? '' : 's'}.`;\n\n if (alreadyPrecachedCount > 0) {\n message += ` ${alreadyPrecachedCount} ` +\n `file${alreadyPrecachedCount === 1 ? 
' is' : 's are'} already cached.`;\n }\n\n logger.groupCollapsed(message);\n\n _nestedGroup(`View newly precached URLs.`, urlsToPrecache);\n _nestedGroup(`View previously precached URLs.`, urlsAlreadyPrecached);\n logger.groupEnd();\n }\n}\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {assert} from 'workbox-core/_private/assert.mjs';\nimport {cacheNames} from 'workbox-core/_private/cacheNames.mjs';\nimport {cacheWrapper} from 'workbox-core/_private/cacheWrapper.mjs';\nimport {fetchWrapper} from 'workbox-core/_private/fetchWrapper.mjs';\nimport {WorkboxError} from 'workbox-core/_private/WorkboxError.mjs';\n\nimport {cleanRedirect} from './utils/cleanRedirect.mjs';\nimport {createCacheKey} from './utils/createCacheKey.mjs';\nimport {printCleanupDetails} from './utils/printCleanupDetails.mjs';\nimport {printInstallDetails} from './utils/printInstallDetails.mjs';\n\nimport './_version.mjs';\n\n\n/**\n * Performs efficient precaching of assets.\n *\n * @memberof module:workbox-precaching\n */\nclass PrecacheController {\n /**\n * Create a new PrecacheController.\n *\n * @param {string} [cacheName] An optional name for the cache, to override\n * the default precache name.\n */\n constructor(cacheName) {\n this._cacheName = cacheNames.getPrecacheName(cacheName);\n this._urlsToCacheKeys = new Map();\n }\n\n /**\n * This method will add items to the precache list, removing duplicates\n * and ensuring the information is valid.\n *\n * @param {\n * Array\n * } entries Array of entries to precache.\n */\n addToCacheList(entries) {\n if (process.env.NODE_ENV !== 'production') {\n assert.isArray(entries, {\n moduleName: 'workbox-precaching',\n className: 'PrecacheController',\n funcName: 'addToCacheList',\n paramName: 'entries',\n });\n }\n\n for (const entry of entries) {\n const {cacheKey, url} = createCacheKey(entry);\n if 
(this._urlsToCacheKeys.has(url) &&\n this._urlsToCacheKeys.get(url) !== cacheKey) {\n throw new WorkboxError('add-to-cache-list-conflicting-entries', {\n firstEntry: this._urlsToCacheKeys.get(url),\n secondEntry: cacheKey,\n });\n }\n this._urlsToCacheKeys.set(url, cacheKey);\n }\n }\n\n /**\n * Precaches new and updated assets. Call this method from the service worker\n * install event.\n *\n * @param {Object} options\n * @param {Event} [options.event] The install event (if needed).\n * @param {Array} [options.plugins] Plugins to be used for fetching\n * and caching during install.\n * @return {Promise}\n */\n async install({event, plugins} = {}) {\n if (process.env.NODE_ENV !== 'production') {\n if (plugins) {\n assert.isArray(plugins, {\n moduleName: 'workbox-precaching',\n className: 'PrecacheController',\n funcName: 'install',\n paramName: 'plugins',\n });\n }\n }\n\n const urlsToPrecache = [];\n const urlsAlreadyPrecached = [];\n\n const cache = await caches.open(this._cacheName);\n const alreadyCachedRequests = await cache.keys();\n const alreadyCachedURLs = new Set(alreadyCachedRequests.map(\n (request) => request.url));\n\n for (const cacheKey of this._urlsToCacheKeys.values()) {\n if (alreadyCachedURLs.has(cacheKey)) {\n urlsAlreadyPrecached.push(cacheKey);\n } else {\n urlsToPrecache.push(cacheKey);\n }\n }\n\n const precacheRequests = urlsToPrecache.map((url) => {\n return this._addURLToCache({event, plugins, url});\n });\n await Promise.all(precacheRequests);\n\n if (process.env.NODE_ENV !== 'production') {\n printInstallDetails(urlsToPrecache, urlsAlreadyPrecached);\n }\n\n return {\n updatedURLs: urlsToPrecache,\n notUpdatedURLs: urlsAlreadyPrecached,\n };\n }\n\n /**\n * Deletes assets that are no longer present in the current precache manifest.\n * Call this method from the service worker activate event.\n *\n * @return {Promise}\n */\n async activate() {\n const cache = await caches.open(this._cacheName);\n const currentlyCachedRequests = await 
cache.keys();\n const expectedCacheKeys = new Set(this._urlsToCacheKeys.values());\n\n const deletedURLs = [];\n for (const request of currentlyCachedRequests) {\n if (!expectedCacheKeys.has(request.url)) {\n await cache.delete(request);\n deletedURLs.push(request.url);\n }\n }\n\n if (process.env.NODE_ENV !== 'production') {\n printCleanupDetails(deletedURLs);\n }\n\n return {deletedURLs};\n }\n\n /**\n * Requests the entry and saves it to the cache if the response is valid.\n * By default, any response with a status code of less than 400 (including\n * opaque responses) is considered valid.\n *\n * If you need to use custom criteria to determine what's valid and what\n * isn't, then pass in an item in `options.plugins` that implements the\n * `cacheWillUpdate()` lifecycle event.\n *\n * @private\n * @param {Object} options\n * @param {string} options.url The URL to fetch and cache.\n * @param {Event} [options.event] The install event (if passed).\n * @param {Array} [options.plugins] An array of plugins to apply to\n * fetch and caching.\n */\n async _addURLToCache({url, event, plugins}) {\n const request = new Request(url, {credentials: 'same-origin'});\n let response = await fetchWrapper.fetch({\n event,\n plugins,\n request,\n });\n\n // Allow developers to override the default logic about what is and isn't\n // valid by passing in a plugin implementing cacheWillUpdate(), e.g.\n // a workbox.cacheableResponse.Plugin instance.\n let cacheWillUpdateCallback;\n for (const plugin of (plugins || [])) {\n if ('cacheWillUpdate' in plugin) {\n cacheWillUpdateCallback = plugin.cacheWillUpdate.bind(plugin);\n }\n }\n\n const isValidResponse = cacheWillUpdateCallback ?\n // Use a callback if provided. 
It returns a truthy value if valid.\n cacheWillUpdateCallback({event, request, response}) :\n // Otherwise, default to considering any response status under 400 valid.\n // This includes, by default, considering opaque responses valid.\n response.status < 400;\n\n // Consider this a failure, leading to the `install` handler failing, if\n // we get back an invalid response.\n if (!isValidResponse) {\n throw new WorkboxError('bad-precaching-response', {\n url,\n status: response.status,\n });\n }\n\n if (response.redirected) {\n response = await cleanRedirect(response);\n }\n\n await cacheWrapper.put({\n event,\n plugins,\n request,\n response,\n cacheName: this._cacheName,\n matchOptions: {\n ignoreSearch: true,\n },\n });\n }\n\n /**\n * Returns a mapping of a precached URL to the corresponding cache key, taking\n * into account the revision information for the URL.\n *\n * @return {Map} A URL to cache key mapping.\n */\n getURLsToCacheKeys() {\n return this._urlsToCacheKeys;\n }\n\n /**\n * Returns a list of all the URLs that have been precached by the current\n * service worker.\n *\n * @return {Array} The precached URLs.\n */\n getCachedURLs() {\n return [...this._urlsToCacheKeys.keys()];\n }\n\n /**\n * Returns the cache key used for storing a given URL. 
If that URL is\n * unversioned, like `/index.html', then the cache key will be the original\n * URL with a search parameter appended to it.\n *\n * @param {string} url A URL whose cache key you want to look up.\n * @return {string} The versioned URL that corresponds to a cache key\n * for the original URL, or undefined if that URL isn't precached.\n */\n getCacheKeyForURL(url) {\n const urlObject = new URL(url, location);\n return this._urlsToCacheKeys.get(urlObject.href);\n }\n}\n\nexport {PrecacheController};\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {PrecacheController} from '../PrecacheController.mjs';\nimport '../_version.mjs';\n\n\nlet precacheController;\n\n/**\n * @return {PrecacheController}\n * @private\n */\nexport const getOrCreatePrecacheController = () => {\n if (!precacheController) {\n precacheController = new PrecacheController();\n }\n return precacheController;\n};\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport '../_version.mjs';\n\n/**\n * Removes any URL search parameters that should be ignored.\n *\n * @param {URL} urlObject The original URL.\n * @param {Array} ignoreURLParametersMatching RegExps to test against\n * each search parameter name. 
Matches mean that the search parameter should be\n * ignored.\n * @return {URL} The URL with any ignored search parameters removed.\n *\n * @private\n * @memberof module:workbox-precaching\n */\nexport function removeIgnoredSearchParams(urlObject,\n ignoreURLParametersMatching) {\n // Convert the iterable into an array at the start of the loop to make sure\n // deletion doesn't mess up iteration.\n for (const paramName of [...urlObject.searchParams.keys()]) {\n if (ignoreURLParametersMatching.some((regExp) => regExp.test(paramName))) {\n urlObject.searchParams.delete(paramName);\n }\n }\n\n return urlObject;\n}\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {removeIgnoredSearchParams} from './removeIgnoredSearchParams.mjs';\n\nimport '../_version.mjs';\n\n/**\n * Generator function that yields possible variations on the original URL to\n * check, one at a time.\n *\n * @param {string} url\n * @param {Object} options\n *\n * @private\n * @memberof module:workbox-precaching\n */\nexport function* generateURLVariations(url, {\n ignoreURLParametersMatching,\n directoryIndex,\n cleanURLs,\n urlManipulation,\n} = {}) {\n const urlObject = new URL(url, location);\n urlObject.hash = '';\n yield urlObject.href;\n\n const urlWithoutIgnoredParams = removeIgnoredSearchParams(\n urlObject, ignoreURLParametersMatching);\n yield urlWithoutIgnoredParams.href;\n\n if (directoryIndex && urlWithoutIgnoredParams.pathname.endsWith('/')) {\n const directoryURL = new URL(urlWithoutIgnoredParams);\n directoryURL.pathname += directoryIndex;\n yield directoryURL.href;\n }\n\n if (cleanURLs) {\n const cleanURL = new URL(urlWithoutIgnoredParams);\n cleanURL.pathname += '.html';\n yield cleanURL.href;\n }\n\n if (urlManipulation) {\n const additionalURLs = urlManipulation({url: urlObject});\n for (const urlToAttempt of additionalURLs) {\n yield 
urlToAttempt.href;\n }\n }\n}\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {getOrCreatePrecacheController}\n from './getOrCreatePrecacheController.mjs';\nimport {generateURLVariations} from './generateURLVariations.mjs';\nimport '../_version.mjs';\n\n/**\n * This function will take the request URL and manipulate it based on the\n * configuration options.\n *\n * @param {string} url\n * @param {Object} options\n * @return {string} Returns the URL in the cache that matches the request,\n * if possible.\n *\n * @private\n */\nexport const getCacheKeyForURL = (url, options) => {\n const precacheController = getOrCreatePrecacheController();\n\n const urlsToCacheKeys = precacheController.getURLsToCacheKeys();\n for (const possibleURL of generateURLVariations(url, options)) {\n const possibleCacheKey = urlsToCacheKeys.get(possibleURL);\n if (possibleCacheKey) {\n return possibleCacheKey;\n }\n }\n};\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {cacheNames} from 'workbox-core/_private/cacheNames.mjs';\nimport {getFriendlyURL} from 'workbox-core/_private/getFriendlyURL.mjs';\nimport {logger} from 'workbox-core/_private/logger.mjs';\nimport {getCacheKeyForURL} from './getCacheKeyForURL.mjs';\nimport '../_version.mjs';\n\n\n/**\n * Adds a `fetch` listener to the service worker that will\n * respond to\n * [network requests]{@link https://developer.mozilla.org/en-US/docs/Web/API/Service_Worker_API/Using_Service_Workers#Custom_responses_to_requests}\n * with precached assets.\n *\n * Requests for assets that aren't precached, the `FetchEvent` will not be\n * responded to, allowing the event to fall through to other `fetch` event\n * listeners.\n *\n * NOTE: when called more 
than once this method will replace the previously set\n * configuration options. Calling it more than once is not recommended outside\n * of tests.\n *\n * @private\n * @param {Object} options\n * @param {string} [options.directoryIndex=index.html] The `directoryIndex` will\n * check cache entries for a URLs ending with '/' to see if there is a hit when\n * appending the `directoryIndex` value.\n * @param {Array} [options.ignoreURLParametersMatching=[/^utm_/]] An\n * array of regex's to remove search params when looking for a cache match.\n * @param {boolean} [options.cleanURLs=true] The `cleanURLs` option will\n * check the cache for the URL with a `.html` added to the end of the end.\n * @param {workbox.precaching~urlManipulation} [options.urlManipulation]\n * This is a function that should take a URL and return an array of\n * alternative URL's that should be checked for precache matches.\n */\nexport const addFetchListener = ({\n ignoreURLParametersMatching = [/^utm_/],\n directoryIndex = 'index.html',\n cleanURLs = true,\n urlManipulation = null,\n} = {}) => {\n const cacheName = cacheNames.getPrecacheName();\n\n addEventListener('fetch', (event) => {\n const precachedURL = getCacheKeyForURL(event.request.url, {\n cleanURLs,\n directoryIndex,\n ignoreURLParametersMatching,\n urlManipulation,\n });\n if (!precachedURL) {\n if (process.env.NODE_ENV !== 'production') {\n logger.debug(`Precaching did not find a match for ` +\n getFriendlyURL(event.request.url));\n }\n return;\n }\n\n let responsePromise = caches.open(cacheName).then((cache) => {\n return cache.match(precachedURL);\n }).then((cachedResponse) => {\n if (cachedResponse) {\n return cachedResponse;\n }\n\n // Fall back to the network if we don't have a cached response\n // (perhaps due to manual cache cleanup).\n if (process.env.NODE_ENV !== 'production') {\n logger.warn(`The precached response for ` +\n `${getFriendlyURL(precachedURL)} in ${cacheName} was not found. 
` +\n `Falling back to the network instead.`);\n }\n\n return fetch(precachedURL);\n });\n\n if (process.env.NODE_ENV !== 'production') {\n responsePromise = responsePromise.then((response) => {\n // Workbox is going to handle the route.\n // print the routing details to the console.\n logger.groupCollapsed(`Precaching is responding to: ` +\n getFriendlyURL(event.request.url));\n logger.log(`Serving the precached url: ${precachedURL}`);\n\n logger.groupCollapsed(`View request details here.`);\n logger.log(event.request);\n logger.groupEnd();\n\n logger.groupCollapsed(`View response details here.`);\n logger.log(response);\n logger.groupEnd();\n\n logger.groupEnd();\n return response;\n });\n }\n\n event.respondWith(responsePromise);\n });\n};\n","\n/*\n Copyright 2019 Google LLC\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {addFetchListener} from './utils/addFetchListener.mjs';\nimport './_version.mjs';\n\n\nlet listenerAdded = false;\n\n/**\n * Add a `fetch` listener to the service worker that will\n * respond to\n * [network requests]{@link https://developer.mozilla.org/en-US/docs/Web/API/Service_Worker_API/Using_Service_Workers#Custom_responses_to_requests}\n * with precached assets.\n *\n * Requests for assets that aren't precached, the `FetchEvent` will not be\n * responded to, allowing the event to fall through to other `fetch` event\n * listeners.\n *\n * @param {Object} options\n * @param {string} [options.directoryIndex=index.html] The `directoryIndex` will\n * check cache entries for a URLs ending with '/' to see if there is a hit when\n * appending the `directoryIndex` value.\n * @param {Array} [options.ignoreURLParametersMatching=[/^utm_/]] An\n * array of regex's to remove search params when looking for a cache match.\n * @param {boolean} [options.cleanURLs=true] The `cleanURLs` option will\n * check the cache for the URL with a `.html` 
added to the end of the end.\n * @param {workbox.precaching~urlManipulation} [options.urlManipulation]\n * This is a function that should take a URL and return an array of\n * alternative URL's that should be checked for precache matches.\n *\n * @alias workbox.precaching.addRoute\n */\nexport const addRoute = (options) => {\n if (!listenerAdded) {\n addFetchListener(options);\n listenerAdded = true;\n }\n};\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport '../_version.mjs';\n\nconst SUBSTRING_TO_FIND = '-precache-';\n\n/**\n * Cleans up incompatible precaches that were created by older versions of\n * Workbox, by a service worker registered under the current scope.\n *\n * This is meant to be called as part of the `activate` event.\n *\n * This should be safe to use as long as you don't include `substringToFind`\n * (defaulting to `-precache-`) in your non-precache cache names.\n *\n * @param {string} currentPrecacheName The cache name currently in use for\n * precaching. 
This cache won't be deleted.\n * @param {string} [substringToFind='-precache-'] Cache names which include this\n * substring will be deleted (excluding `currentPrecacheName`).\n * @return {Array} A list of all the cache names that were deleted.\n *\n * @private\n * @memberof module:workbox-precaching\n */\nconst deleteOutdatedCaches = async (\n currentPrecacheName,\n substringToFind = SUBSTRING_TO_FIND) => {\n const cacheNames = await caches.keys();\n\n const cacheNamesToDelete = cacheNames.filter((cacheName) => {\n return cacheName.includes(substringToFind) &&\n cacheName.includes(self.registration.scope) &&\n cacheName !== currentPrecacheName;\n });\n\n await Promise.all(\n cacheNamesToDelete.map((cacheName) => caches.delete(cacheName)));\n\n return cacheNamesToDelete;\n};\n\nexport {deleteOutdatedCaches};\n\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {cacheNames} from 'workbox-core/_private/cacheNames.mjs';\nimport {logger} from 'workbox-core/_private/logger.mjs';\nimport {deleteOutdatedCaches} from './utils/deleteOutdatedCaches.mjs';\nimport './_version.mjs';\n\n\n/**\n * Adds an `activate` event listener which will clean up incompatible\n * precaches that were created by older versions of Workbox.\n *\n * @alias workbox.precaching.cleanupOutdatedCaches\n */\nexport const cleanupOutdatedCaches = () => {\n addEventListener('activate', (event) => {\n const cacheName = cacheNames.getPrecacheName();\n\n event.waitUntil(deleteOutdatedCaches(cacheName).then((cachesDeleted) => {\n if (process.env.NODE_ENV !== 'production') {\n if (cachesDeleted.length > 0) {\n logger.log(`The following out-of-date precaches were cleaned up ` +\n `automatically:`, cachesDeleted);\n }\n }\n }));\n });\n};\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the 
LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {getOrCreatePrecacheController}\n from './utils/getOrCreatePrecacheController.mjs';\nimport './_version.mjs';\n\n\n/**\n * Takes in a URL, and returns the corresponding URL that could be used to\n * lookup the entry in the precache.\n *\n * If a relative URL is provided, the location of the service worker file will\n * be used as the base.\n *\n * For precached entries without revision information, the cache key will be the\n * same as the original URL.\n *\n * For precached entries with revision information, the cache key will be the\n * original URL with the addition of a query parameter used for keeping track of\n * the revision info.\n *\n * @param {string} url The URL whose cache key to look up.\n * @return {string} The cache key that corresponds to that URL.\n *\n * @alias workbox.precaching.getCacheKeyForURL\n */\nexport const getCacheKeyForURL = (url) => {\n const precacheController = getOrCreatePrecacheController();\n return precacheController.getCacheKeyForURL(url);\n};\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {logger} from 'workbox-core/_private/logger.mjs';\nimport {getOrCreatePrecacheController} from './utils/getOrCreatePrecacheController.mjs';\nimport {precachePlugins} from './utils/precachePlugins.mjs';\nimport './_version.mjs';\n\n\nconst installListener = (event) => {\n const precacheController = getOrCreatePrecacheController();\n const plugins = precachePlugins.get();\n\n event.waitUntil(\n precacheController.install({event, plugins})\n .catch((error) => {\n if (process.env.NODE_ENV !== 'production') {\n logger.error(`Service worker installation failed. 
It will ` +\n `be retried automatically during the next navigation.`);\n }\n // Re-throw the error to ensure installation fails.\n throw error;\n })\n );\n};\n\nconst activateListener = (event) => {\n const precacheController = getOrCreatePrecacheController();\n const plugins = precachePlugins.get();\n\n event.waitUntil(precacheController.activate({event, plugins}));\n};\n\n/**\n * Adds items to the precache list, removing any duplicates and\n * stores the files in the\n * [\"precache cache\"]{@link module:workbox-core.cacheNames} when the service\n * worker installs.\n *\n * This method can be called multiple times.\n *\n * Please note: This method **will not** serve any of the cached files for you.\n * It only precaches files. To respond to a network request you call\n * [addRoute()]{@link module:workbox-precaching.addRoute}.\n *\n * If you have a single array of files to precache, you can just call\n * [precacheAndRoute()]{@link module:workbox-precaching.precacheAndRoute}.\n *\n * @param {Array} entries Array of entries to precache.\n *\n * @alias workbox.precaching.precache\n */\nexport const precache = (entries) => {\n const precacheController = getOrCreatePrecacheController();\n precacheController.addToCacheList(entries);\n\n if (entries.length > 0) {\n // NOTE: these listeners will only be added once (even if the `precache()`\n // method is called multiple times) because event listeners are implemented\n // as a set, where each listener must be unique.\n addEventListener('install', installListener);\n addEventListener('activate', activateListener);\n }\n};\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {addRoute} from './addRoute.mjs';\nimport {precache} from './precache.mjs';\nimport './_version.mjs';\n\n\n/**\n * This method will add entries to the precache list and add a route to\n * respond to fetch 
events.\n *\n * This is a convenience method that will call\n * [precache()]{@link module:workbox-precaching.precache} and\n * [addRoute()]{@link module:workbox-precaching.addRoute} in a single call.\n *\n * @param {Array} entries Array of entries to precache.\n * @param {Object} options See\n * [addRoute() options]{@link module:workbox-precaching.addRoute}.\n *\n * @alias workbox.precaching.precacheAndRoute\n */\nexport const precacheAndRoute = (entries, options) => {\n precache(entries);\n addRoute(options);\n};\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {assert} from 'workbox-core/_private/assert.mjs';\nimport {addPlugins} from './addPlugins.mjs';\nimport {addRoute} from './addRoute.mjs';\nimport {cleanupOutdatedCaches} from './cleanupOutdatedCaches.mjs';\nimport {getCacheKeyForURL} from './getCacheKeyForURL.mjs';\nimport {precache} from './precache.mjs';\nimport {precacheAndRoute} from './precacheAndRoute.mjs';\nimport {PrecacheController} from './PrecacheController.mjs';\nimport './_version.mjs';\n\n\nif (process.env.NODE_ENV !== 'production') {\n assert.isSWEnv('workbox-precaching');\n}\n\n/**\n * Most consumers of this module will want to use the\n * [precacheAndRoute()]{@link workbox.precaching.precacheAndRoute}\n * method to add assets to the Cache and respond to network requests with these\n * cached assets.\n *\n * If you require finer grained control, you can use the\n * [PrecacheController]{@link workbox.precaching.PrecacheController}\n * to determine when performed.\n *\n * @namespace workbox.precaching\n */\n\nexport {\n addPlugins,\n addRoute,\n cleanupOutdatedCaches,\n getCacheKeyForURL,\n precache,\n precacheAndRoute,\n 
PrecacheController,\n};\n"],"names":["self","_","e","plugins","precachePlugins","get","add","newPlugins","push","addPlugins","cleanRedirect","response","clonedResponse","clone","bodyPromise","Promise","resolve","body","blob","Response","headers","status","statusText","REVISION_SEARCH_PARAM","createCacheKey","entry","WorkboxError","urlObject","URL","location","cacheKey","href","url","revision","originalURL","cacheKeyURL","searchParams","set","logGroup","groupTitle","deletedURLs","logger","groupCollapsed","log","groupEnd","printCleanupDetails","deletionCount","length","_nestedGroup","urls","printInstallDetails","urlsToPrecache","urlsAlreadyPrecached","precachedCount","alreadyPrecachedCount","message","PrecacheController","constructor","cacheName","_cacheName","cacheNames","getPrecacheName","_urlsToCacheKeys","Map","addToCacheList","entries","assert","isArray","moduleName","className","funcName","paramName","has","firstEntry","secondEntry","install","event","cache","caches","open","alreadyCachedRequests","keys","alreadyCachedURLs","Set","map","request","values","precacheRequests","_addURLToCache","all","updatedURLs","notUpdatedURLs","activate","currentlyCachedRequests","expectedCacheKeys","delete","Request","credentials","fetchWrapper","fetch","cacheWillUpdateCallback","plugin","cacheWillUpdate","bind","isValidResponse","redirected","cacheWrapper","put","matchOptions","ignoreSearch","getURLsToCacheKeys","getCachedURLs","getCacheKeyForURL","precacheController","getOrCreatePrecacheController","removeIgnoredSearchParams","ignoreURLParametersMatching","some","regExp","test","generateURLVariations","directoryIndex","cleanURLs","urlManipulation","hash","urlWithoutIgnoredParams","pathname","endsWith","directoryURL","cleanURL","additionalURLs","urlToAttempt","options","urlsToCacheKeys","possibleURL","possibleCacheKey","addFetchListener","addEventListener","precachedURL","debug","getFriendlyURL","responsePromise","then","match","cachedResponse","warn","respondWith","listenerAdd
ed","addRoute","SUBSTRING_TO_FIND","deleteOutdatedCaches","currentPrecacheName","substringToFind","cacheNamesToDelete","filter","includes","registration","scope","cleanupOutdatedCaches","waitUntil","cachesDeleted","installListener","catch","error","activateListener","precache","precacheAndRoute","isSWEnv"],"mappings":";;;;EAAA,IAAG;EAACA,EAAAA,IAAI,CAAC,0BAAD,CAAJ,IAAkCC,CAAC,EAAnC;EAAsC,CAA1C,CAA0C,OAAMC,CAAN,EAAQ;;ECAlD;;;;;;;AAQA,EAGA,MAAMC,OAAO,GAAG,EAAhB;AAEA,EAAO,MAAMC,eAAe,GAAG;EAC7B;;;;EAIAC,EAAAA,GAAG,GAAG;EACJ,WAAOF,OAAP;EACD,GAP4B;;EAS7B;;;;EAIAG,EAAAA,GAAG,CAACC,UAAD,EAAa;EACdJ,IAAAA,OAAO,CAACK,IAAR,CAAa,GAAGD,UAAhB;EACD;;EAf4B,CAAxB;;ECbP;;;;;;;AAQA,EAIA;;;;;;;;AAOA,QAAME,UAAU,GAAIF,UAAD,IAAgB;EACjCH,EAAAA,eAAe,CAACE,GAAhB,CAAoBC,UAApB;EACD,CAFD;;ECnBA;;;;;;;AAQA,EAEA;;;;;;;;AAOA,EAAO,eAAeG,aAAf,CAA6BC,QAA7B,EAAuC;EAC5C,QAAMC,cAAc,GAAGD,QAAQ,CAACE,KAAT,EAAvB,CAD4C;EAI5C;;EACA,QAAMC,WAAW,GAAG,UAAUF,cAAV,GAClBG,OAAO,CAACC,OAAR,CAAgBJ,cAAc,CAACK,IAA/B,CADkB,GAElBL,cAAc,CAACM,IAAf,EAFF;EAIA,QAAMD,IAAI,GAAG,MAAMH,WAAnB,CAT4C;;EAY5C,SAAO,IAAIK,QAAJ,CAAaF,IAAb,EAAmB;EACxBG,IAAAA,OAAO,EAAER,cAAc,CAACQ,OADA;EAExBC,IAAAA,MAAM,EAAET,cAAc,CAACS,MAFC;EAGxBC,IAAAA,UAAU,EAAEV,cAAc,CAACU;EAHH,GAAnB,CAAP;EAKD;;EClCD;;;;;;;AAQA;EAKA,MAAMC,qBAAqB,GAAG,iBAA9B;EAEA;;;;;;;;;;AASA,EAAO,SAASC,cAAT,CAAwBC,KAAxB,EAA+B;EACpC,MAAI,CAACA,KAAL,EAAY;EACV,UAAM,IAAIC,6BAAJ,CAAiB,mCAAjB,EAAsD;EAACD,MAAAA;EAAD,KAAtD,CAAN;EACD,GAHmC;EAMpC;;;EACA,MAAI,OAAOA,KAAP,KAAiB,QAArB,EAA+B;EAC7B,UAAME,SAAS,GAAG,IAAIC,GAAJ,CAAQH,KAAR,EAAeI,QAAf,CAAlB;EACA,WAAO;EACLC,MAAAA,QAAQ,EAAEH,SAAS,CAACI,IADf;EAELC,MAAAA,GAAG,EAAEL,SAAS,CAACI;EAFV,KAAP;EAID;;EAED,QAAM;EAACE,IAAAA,QAAD;EAAWD,IAAAA;EAAX,MAAkBP,KAAxB;;EACA,MAAI,CAACO,GAAL,EAAU;EACR,UAAM,IAAIN,6BAAJ,CAAiB,mCAAjB,EAAsD;EAACD,MAAAA;EAAD,KAAtD,CAAN;EACD,GAlBmC;EAqBpC;;;EACA,MAAI,CAACQ,QAAL,EAAe;EACb,UAAMN,SAAS,GAAG,IAAIC,GAAJ,CAAQI,GAAR,EAAaH,QAAb,CAAlB;EACA,WAAO;EACLC,MAAAA,QAAQ,EAAEH,SAAS,CAACI,IADf;EAELC,MAAAA,GAAG,EAAEL,SAAS,CAACI;EAFV,KAAP;EAID,G
A5BmC;EA+BpC;;;EACA,QAAMG,WAAW,GAAG,IAAIN,GAAJ,CAAQI,GAAR,EAAaH,QAAb,CAApB;EACA,QAAMM,WAAW,GAAG,IAAIP,GAAJ,CAAQI,GAAR,EAAaH,QAAb,CAApB;EACAM,EAAAA,WAAW,CAACC,YAAZ,CAAyBC,GAAzB,CAA6Bd,qBAA7B,EAAoDU,QAApD;EACA,SAAO;EACLH,IAAAA,QAAQ,EAAEK,WAAW,CAACJ,IADjB;EAELC,IAAAA,GAAG,EAAEE,WAAW,CAACH;EAFZ,GAAP;EAID;;EC/DD;;;;;;;AAQA;EAIA,MAAMO,QAAQ,GAAG,CAACC,UAAD,EAAaC,WAAb,KAA6B;EAC5CC,EAAAA,iBAAM,CAACC,cAAP,CAAsBH,UAAtB;;EAEA,OAAK,MAAMP,GAAX,IAAkBQ,WAAlB,EAA+B;EAC7BC,IAAAA,iBAAM,CAACE,GAAP,CAAWX,GAAX;EACD;;EAEDS,EAAAA,iBAAM,CAACG,QAAP;EACD,CARD;EAUA;;;;;;;;AAMA,EAAO,SAASC,mBAAT,CAA6BL,WAA7B,EAA0C;EAC/C,QAAMM,aAAa,GAAGN,WAAW,CAACO,MAAlC;;EACA,MAAID,aAAa,GAAG,CAApB,EAAuB;EACrBL,IAAAA,iBAAM,CAACC,cAAP,CAAuB,6BAAD,GACjB,GAAEI,aAAc,UADC,GAEjB,UAASA,aAAa,KAAK,CAAlB,GAAsB,MAAtB,GAA+B,QAAS,WAFtD;EAGAR,IAAAA,QAAQ,CAAC,wBAAD,EAA2BE,WAA3B,CAAR;EACAC,IAAAA,iBAAM,CAACG,QAAP;EACD;EACF;;ECrCD;;;;;;;AAQA,EAIA;;;;;;;EAMA,SAASI,YAAT,CAAsBT,UAAtB,EAAkCU,IAAlC,EAAwC;EACtC,MAAIA,IAAI,CAACF,MAAL,KAAgB,CAApB,EAAuB;EACrB;EACD;;EAEDN,EAAAA,iBAAM,CAACC,cAAP,CAAsBH,UAAtB;;EAEA,OAAK,MAAMP,GAAX,IAAkBiB,IAAlB,EAAwB;EACtBR,IAAAA,iBAAM,CAACE,GAAP,CAAWX,GAAX;EACD;;EAEDS,EAAAA,iBAAM,CAACG,QAAP;EACD;EAED;;;;;;;;;AAOA,EAAO,SAASM,mBAAT,CAA6BC,cAA7B,EAA6CC,oBAA7C,EAAmE;EACxE,QAAMC,cAAc,GAAGF,cAAc,CAACJ,MAAtC;EACA,QAAMO,qBAAqB,GAAGF,oBAAoB,CAACL,MAAnD;;EAEA,MAAIM,cAAc,IAAIC,qBAAtB,EAA6C;EAC3C,QAAIC,OAAO,GACN,cAAaF,cAAe,QAAOA,cAAc,KAAK,CAAnB,GAAuB,EAAvB,GAA4B,GAAI,GADxE;;EAGA,QAAIC,qBAAqB,GAAG,CAA5B,EAA+B;EAC7BC,MAAAA,OAAO,IAAK,IAAGD,qBAAsB,GAA1B,GACR,OAAMA,qBAAqB,KAAK,CAA1B,GAA8B,KAA9B,GAAsC,OAAQ,kBADvD;EAED;;EAEDb,IAAAA,iBAAM,CAACC,cAAP,CAAsBa,OAAtB;;EAEAP,IAAAA,YAAY,CAAE,4BAAF,EAA+BG,cAA/B,CAAZ;;EACAH,IAAAA,YAAY,CAAE,iCAAF,EAAoCI,oBAApC,CAAZ;;EACAX,IAAAA,iBAAM,CAACG,QAAP;EACD;EACF;;EC1DD;;;;;;;AAQA,EAcA;;;;;;EAKA,MAAMY,kBAAN,CAAyB;EACvB;;;;;;EAMAC,EAAAA,WAAW,CAACC,SAAD,EAAY;EACrB,SAAKC,UAAL,GAAkBC,yBAAU,CAACC,eAAX,CAA2BH,SAA3B,CAAlB;EACA,SAAKI,gBAAL,GAAwB,IAAIC,GAAJ,EAAxB;EACD;EAED;;;;;;;;;;EAQAC,EAAAA,cAAc,CAACC,
OAAD,EAAU;EACtB,IAA2C;EACzCC,MAAAA,iBAAM,CAACC,OAAP,CAAeF,OAAf,EAAwB;EACtBG,QAAAA,UAAU,EAAE,oBADU;EAEtBC,QAAAA,SAAS,EAAE,oBAFW;EAGtBC,QAAAA,QAAQ,EAAE,gBAHY;EAItBC,QAAAA,SAAS,EAAE;EAJW,OAAxB;EAMD;;EAED,SAAK,MAAM9C,KAAX,IAAoBwC,OAApB,EAA6B;EAC3B,YAAM;EAACnC,QAAAA,QAAD;EAAWE,QAAAA;EAAX,UAAkBR,cAAc,CAACC,KAAD,CAAtC;;EACA,UAAI,KAAKqC,gBAAL,CAAsBU,GAAtB,CAA0BxC,GAA1B,KACA,KAAK8B,gBAAL,CAAsBzD,GAAtB,CAA0B2B,GAA1B,MAAmCF,QADvC,EACiD;EAC/C,cAAM,IAAIJ,6BAAJ,CAAiB,uCAAjB,EAA0D;EAC9D+C,UAAAA,UAAU,EAAE,KAAKX,gBAAL,CAAsBzD,GAAtB,CAA0B2B,GAA1B,CADkD;EAE9D0C,UAAAA,WAAW,EAAE5C;EAFiD,SAA1D,CAAN;EAID;;EACD,WAAKgC,gBAAL,CAAsBzB,GAAtB,CAA0BL,GAA1B,EAA+BF,QAA/B;EACD;EACF;EAED;;;;;;;;;;;;EAUA,QAAM6C,OAAN,CAAc;EAACC,IAAAA,KAAD;EAAQzE,IAAAA;EAAR,MAAmB,EAAjC,EAAqC;EACnC,IAA2C;EACzC,UAAIA,OAAJ,EAAa;EACX+D,QAAAA,iBAAM,CAACC,OAAP,CAAehE,OAAf,EAAwB;EACtBiE,UAAAA,UAAU,EAAE,oBADU;EAEtBC,UAAAA,SAAS,EAAE,oBAFW;EAGtBC,UAAAA,QAAQ,EAAE,SAHY;EAItBC,UAAAA,SAAS,EAAE;EAJW,SAAxB;EAMD;EACF;;EAED,UAAMpB,cAAc,GAAG,EAAvB;EACA,UAAMC,oBAAoB,GAAG,EAA7B;EAEA,UAAMyB,KAAK,GAAG,MAAMC,MAAM,CAACC,IAAP,CAAY,KAAKpB,UAAjB,CAApB;EACA,UAAMqB,qBAAqB,GAAG,MAAMH,KAAK,CAACI,IAAN,EAApC;EACA,UAAMC,iBAAiB,GAAG,IAAIC,GAAJ,CAAQH,qBAAqB,CAACI,GAAtB,CAC7BC,OAAD,IAAaA,OAAO,CAACrD,GADS,CAAR,CAA1B;;EAGA,SAAK,MAAMF,QAAX,IAAuB,KAAKgC,gBAAL,CAAsBwB,MAAtB,EAAvB,EAAuD;EACrD,UAAIJ,iBAAiB,CAACV,GAAlB,CAAsB1C,QAAtB,CAAJ,EAAqC;EACnCsB,QAAAA,oBAAoB,CAAC5C,IAArB,CAA0BsB,QAA1B;EACD,OAFD,MAEO;EACLqB,QAAAA,cAAc,CAAC3C,IAAf,CAAoBsB,QAApB;EACD;EACF;;EAED,UAAMyD,gBAAgB,GAAGpC,cAAc,CAACiC,GAAf,CAAoBpD,GAAD,IAAS;EACnD,aAAO,KAAKwD,cAAL,CAAoB;EAACZ,QAAAA,KAAD;EAAQzE,QAAAA,OAAR;EAAiB6B,QAAAA;EAAjB,OAApB,CAAP;EACD,KAFwB,CAAzB;EAGA,UAAMjB,OAAO,CAAC0E,GAAR,CAAYF,gBAAZ,CAAN;;EAEA,IAA2C;EACzCrC,MAAAA,mBAAmB,CAACC,cAAD,EAAiBC,oBAAjB,CAAnB;EACD;;EAED,WAAO;EACLsC,MAAAA,WAAW,EAAEvC,cADR;EAELwC,MAAAA,cAAc,EAAEvC;EAFX,KAAP;EAID;EAED;;;;;;;;EAMA,QAAMwC,QAAN,GAAiB;EACf,UAAMf,KAAK,GAAG,MAAMC,MAAM,CAACC,IAAP,CAAY,KAAKpB,UAAjB,CAApB;EACA,UAAMkC,uBAAuB,GAAG,MAAMhB,KAAK,CAACI,I
AAN,EAAtC;EACA,UAAMa,iBAAiB,GAAG,IAAIX,GAAJ,CAAQ,KAAKrB,gBAAL,CAAsBwB,MAAtB,EAAR,CAA1B;EAEA,UAAM9C,WAAW,GAAG,EAApB;;EACA,SAAK,MAAM6C,OAAX,IAAsBQ,uBAAtB,EAA+C;EAC7C,UAAI,CAACC,iBAAiB,CAACtB,GAAlB,CAAsBa,OAAO,CAACrD,GAA9B,CAAL,EAAyC;EACvC,cAAM6C,KAAK,CAACkB,MAAN,CAAaV,OAAb,CAAN;EACA7C,QAAAA,WAAW,CAAChC,IAAZ,CAAiB6E,OAAO,CAACrD,GAAzB;EACD;EACF;;EAED,IAA2C;EACzCa,MAAAA,mBAAmB,CAACL,WAAD,CAAnB;EACD;;EAED,WAAO;EAACA,MAAAA;EAAD,KAAP;EACD;EAED;;;;;;;;;;;;;;;;;;EAgBA,QAAMgD,cAAN,CAAqB;EAACxD,IAAAA,GAAD;EAAM4C,IAAAA,KAAN;EAAazE,IAAAA;EAAb,GAArB,EAA4C;EAC1C,UAAMkF,OAAO,GAAG,IAAIW,OAAJ,CAAYhE,GAAZ,EAAiB;EAACiE,MAAAA,WAAW,EAAE;EAAd,KAAjB,CAAhB;EACA,QAAItF,QAAQ,GAAG,MAAMuF,6BAAY,CAACC,KAAb,CAAmB;EACtCvB,MAAAA,KADsC;EAEtCzE,MAAAA,OAFsC;EAGtCkF,MAAAA;EAHsC,KAAnB,CAArB,CAF0C;EAS1C;EACA;;EACA,QAAIe,uBAAJ;;EACA,SAAK,MAAMC,MAAX,IAAsBlG,OAAO,IAAI,EAAjC,EAAsC;EACpC,UAAI,qBAAqBkG,MAAzB,EAAiC;EAC/BD,QAAAA,uBAAuB,GAAGC,MAAM,CAACC,eAAP,CAAuBC,IAAvB,CAA4BF,MAA5B,CAA1B;EACD;EACF;;EAED,UAAMG,eAAe,GAAGJ,uBAAuB;EAE7CA,IAAAA,uBAAuB,CAAC;EAACxB,MAAAA,KAAD;EAAQS,MAAAA,OAAR;EAAiB1E,MAAAA;EAAjB,KAAD,CAFsB;EAI7C;EACAA,IAAAA,QAAQ,CAACU,MAAT,GAAkB,GALpB,CAlB0C;EA0B1C;;EACA,QAAI,CAACmF,eAAL,EAAsB;EACpB,YAAM,IAAI9E,6BAAJ,CAAiB,yBAAjB,EAA4C;EAChDM,QAAAA,GADgD;EAEhDX,QAAAA,MAAM,EAAEV,QAAQ,CAACU;EAF+B,OAA5C,CAAN;EAID;;EAED,QAAIV,QAAQ,CAAC8F,UAAb,EAAyB;EACvB9F,MAAAA,QAAQ,GAAG,MAAMD,aAAa,CAACC,QAAD,CAA9B;EACD;;EAED,UAAM+F,6BAAY,CAACC,GAAb,CAAiB;EACrB/B,MAAAA,KADqB;EAErBzE,MAAAA,OAFqB;EAGrBkF,MAAAA,OAHqB;EAIrB1E,MAAAA,QAJqB;EAKrB+C,MAAAA,SAAS,EAAE,KAAKC,UALK;EAMrBiD,MAAAA,YAAY,EAAE;EACZC,QAAAA,YAAY,EAAE;EADF;EANO,KAAjB,CAAN;EAUD;EAED;;;;;;;;EAMAC,EAAAA,kBAAkB,GAAG;EACnB,WAAO,KAAKhD,gBAAZ;EACD;EAED;;;;;;;;EAMAiD,EAAAA,aAAa,GAAG;EACd,WAAO,CAAC,GAAG,KAAKjD,gBAAL,CAAsBmB,IAAtB,EAAJ,CAAP;EACD;EAED;;;;;;;;;;;EASA+B,EAAAA,iBAAiB,CAAChF,GAAD,EAAM;EACrB,UAAML,SAAS,GAAG,IAAIC,GAAJ,CAAQI,GAAR,EAAaH,QAAb,CAAlB;EACA,WAAO,KAAKiC,gBAAL,CAAsBzD,GAAtB,CAA0BsB,SAAS,CAACI,IAApC,CAAP;EACD;;EA5NsB;;EC3BzB;;;;;;;AAQA,EAIA,IAAIkF,kBAA
J;EAEA;;;;;AAIA,EAAO,MAAMC,6BAA6B,GAAG,MAAM;EACjD,MAAI,CAACD,kBAAL,EAAyB;EACvBA,IAAAA,kBAAkB,GAAG,IAAIzD,kBAAJ,EAArB;EACD;;EACD,SAAOyD,kBAAP;EACD,CALM;;EClBP;;;;;;;AAQA,EAEA;;;;;;;;;;;;;AAYA,EAAO,SAASE,yBAAT,CAAmCxF,SAAnC,EACHyF,2BADG,EAC0B;EAC/B;EACA;EACA,OAAK,MAAM7C,SAAX,IAAwB,CAAC,GAAG5C,SAAS,CAACS,YAAV,CAAuB6C,IAAvB,EAAJ,CAAxB,EAA4D;EAC1D,QAAImC,2BAA2B,CAACC,IAA5B,CAAkCC,MAAD,IAAYA,MAAM,CAACC,IAAP,CAAYhD,SAAZ,CAA7C,CAAJ,EAA0E;EACxE5C,MAAAA,SAAS,CAACS,YAAV,CAAuB2D,MAAvB,CAA8BxB,SAA9B;EACD;EACF;;EAED,SAAO5C,SAAP;EACD;;ECjCD;;;;;;;AAQA,EAIA;;;;;;;;;;;AAUA,EAAO,UAAU6F,qBAAV,CAAgCxF,GAAhC,EAAqC;EAC1CoF,EAAAA,2BAD0C;EAE1CK,EAAAA,cAF0C;EAG1CC,EAAAA,SAH0C;EAI1CC,EAAAA;EAJ0C,IAKxC,EALG,EAKC;EACN,QAAMhG,SAAS,GAAG,IAAIC,GAAJ,CAAQI,GAAR,EAAaH,QAAb,CAAlB;EACAF,EAAAA,SAAS,CAACiG,IAAV,GAAiB,EAAjB;EACA,QAAMjG,SAAS,CAACI,IAAhB;EAEA,QAAM8F,uBAAuB,GAAGV,yBAAyB,CACrDxF,SADqD,EAC1CyF,2BAD0C,CAAzD;EAEA,QAAMS,uBAAuB,CAAC9F,IAA9B;;EAEA,MAAI0F,cAAc,IAAII,uBAAuB,CAACC,QAAxB,CAAiCC,QAAjC,CAA0C,GAA1C,CAAtB,EAAsE;EACpE,UAAMC,YAAY,GAAG,IAAIpG,GAAJ,CAAQiG,uBAAR,CAArB;EACAG,IAAAA,YAAY,CAACF,QAAb,IAAyBL,cAAzB;EACA,UAAMO,YAAY,CAACjG,IAAnB;EACD;;EAED,MAAI2F,SAAJ,EAAe;EACb,UAAMO,QAAQ,GAAG,IAAIrG,GAAJ,CAAQiG,uBAAR,CAAjB;EACAI,IAAAA,QAAQ,CAACH,QAAT,IAAqB,OAArB;EACA,UAAMG,QAAQ,CAAClG,IAAf;EACD;;EAED,MAAI4F,eAAJ,EAAqB;EACnB,UAAMO,cAAc,GAAGP,eAAe,CAAC;EAAC3F,MAAAA,GAAG,EAAEL;EAAN,KAAD,CAAtC;;EACA,SAAK,MAAMwG,YAAX,IAA2BD,cAA3B,EAA2C;EACzC,YAAMC,YAAY,CAACpG,IAAnB;EACD;EACF;EACF;;ECtDD;;;;;;;AAQA,EAKA;;;;;;;;;;;;AAWA,EAAO,MAAMiF,iBAAiB,GAAG,CAAChF,GAAD,EAAMoG,OAAN,KAAkB;EACjD,QAAMnB,kBAAkB,GAAGC,6BAA6B,EAAxD;EAEA,QAAMmB,eAAe,GAAGpB,kBAAkB,CAACH,kBAAnB,EAAxB;;EACA,OAAK,MAAMwB,WAAX,IAA0Bd,qBAAqB,CAACxF,GAAD,EAAMoG,OAAN,CAA/C,EAA+D;EAC7D,UAAMG,gBAAgB,GAAGF,eAAe,CAAChI,GAAhB,CAAoBiI,WAApB,CAAzB;;EACA,QAAIC,gBAAJ,EAAsB;EACpB,aAAOA,gBAAP;EACD;EACF;EACF,CAVM;;ECxBP;;;;;;;AAQA,EAOA;;;;;;;;;;;;;;;;;;;;;;;;;;;;AA2BA,EAAO,MAAMC,gBAAgB,GAAG,CAAC;EAC/BpB,EAAAA,2BAA2B,GAAG,CAAC,OAAD,CADC;EAE/BK,EAAAA,cAAc,GAAG,YAFc;EAG
/BC,EAAAA,SAAS,GAAG,IAHmB;EAI/BC,EAAAA,eAAe,GAAG;EAJa,IAK7B,EAL4B,KAKrB;EACT,QAAMjE,SAAS,GAAGE,yBAAU,CAACC,eAAX,EAAlB;EAEA4E,EAAAA,gBAAgB,CAAC,OAAD,EAAW7D,KAAD,IAAW;EACnC,UAAM8D,YAAY,GAAG1B,iBAAiB,CAACpC,KAAK,CAACS,OAAN,CAAcrD,GAAf,EAAoB;EACxD0F,MAAAA,SADwD;EAExDD,MAAAA,cAFwD;EAGxDL,MAAAA,2BAHwD;EAIxDO,MAAAA;EAJwD,KAApB,CAAtC;;EAMA,QAAI,CAACe,YAAL,EAAmB;EACjB,MAA2C;EACzCjG,QAAAA,iBAAM,CAACkG,KAAP,CAAc,sCAAD,GACXC,iCAAc,CAAChE,KAAK,CAACS,OAAN,CAAcrD,GAAf,CADhB;EAED;;EACD;EACD;;EAED,QAAI6G,eAAe,GAAG/D,MAAM,CAACC,IAAP,CAAYrB,SAAZ,EAAuBoF,IAAvB,CAA6BjE,KAAD,IAAW;EAC3D,aAAOA,KAAK,CAACkE,KAAN,CAAYL,YAAZ,CAAP;EACD,KAFqB,EAEnBI,IAFmB,CAEbE,cAAD,IAAoB;EAC1B,UAAIA,cAAJ,EAAoB;EAClB,eAAOA,cAAP;EACD,OAHyB;EAM1B;;;EACA,MAA2C;EACzCvG,QAAAA,iBAAM,CAACwG,IAAP,CAAa,6BAAD,GACX,GAAEL,iCAAc,CAACF,YAAD,CAAe,OAAMhF,SAAU,kBADpC,GAEX,sCAFD;EAGD;;EAED,aAAOyC,KAAK,CAACuC,YAAD,CAAZ;EACD,KAhBqB,CAAtB;;EAkBA,IAA2C;EACzCG,MAAAA,eAAe,GAAGA,eAAe,CAACC,IAAhB,CAAsBnI,QAAD,IAAc;EACnD;EACA;EACA8B,QAAAA,iBAAM,CAACC,cAAP,CAAuB,+BAAD,GACpBkG,iCAAc,CAAChE,KAAK,CAACS,OAAN,CAAcrD,GAAf,CADhB;EAEAS,QAAAA,iBAAM,CAACE,GAAP,CAAY,8BAA6B+F,YAAa,EAAtD;EAEAjG,QAAAA,iBAAM,CAACC,cAAP,CAAuB,4BAAvB;EACAD,QAAAA,iBAAM,CAACE,GAAP,CAAWiC,KAAK,CAACS,OAAjB;EACA5C,QAAAA,iBAAM,CAACG,QAAP;EAEAH,QAAAA,iBAAM,CAACC,cAAP,CAAuB,6BAAvB;EACAD,QAAAA,iBAAM,CAACE,GAAP,CAAWhC,QAAX;EACA8B,QAAAA,iBAAM,CAACG,QAAP;EAEAH,QAAAA,iBAAM,CAACG,QAAP;EACA,eAAOjC,QAAP;EACD,OAjBiB,CAAlB;EAkBD;;EAEDiE,IAAAA,KAAK,CAACsE,WAAN,CAAkBL,eAAlB;EACD,GAvDe,CAAhB;EAwDD,CAhEM;;ECzCP;;;;;;AAOA,EAIA,IAAIM,aAAa,GAAG,KAApB;EAEA;;;;;;;;;;;;;;;;;;;;;;;;;AAwBA,QAAaC,QAAQ,GAAIhB,OAAD,IAAa;EACnC,MAAI,CAACe,aAAL,EAAoB;EAClBX,IAAAA,gBAAgB,CAACJ,OAAD,CAAhB;EACAe,IAAAA,aAAa,GAAG,IAAhB;EACD;EACF,CALM;;ECtCP;;;;;;;AAQA,EAEA,MAAME,iBAAiB,GAAG,YAA1B;EAEA;;;;;;;;;;;;;;;;;;;EAkBA,MAAMC,oBAAoB,GAAG,OAC3BC,mBAD2B,EAE3BC,eAAe,GAAGH,iBAFS,KAEa;EACxC,QAAMzF,UAAU,GAAG,MAAMkB,MAAM,CAACG,IAAP,EAAzB;EAEA,QAAMwE,kBAAkB,GAAG7F,UAAU,CAAC8F,MAAX,CAAmBhG,SAAD,IAAe;EAC1D,WAAOA,SAAS,CAACiG,QAAV,CAAmB
H,eAAnB,KACA9F,SAAS,CAACiG,QAAV,CAAmB3J,IAAI,CAAC4J,YAAL,CAAkBC,KAArC,CADA,IAEAnG,SAAS,KAAK6F,mBAFrB;EAGD,GAJ0B,CAA3B;EAMA,QAAMxI,OAAO,CAAC0E,GAAR,CACFgE,kBAAkB,CAACrE,GAAnB,CAAwB1B,SAAD,IAAeoB,MAAM,CAACiB,MAAP,CAAcrC,SAAd,CAAtC,CADE,CAAN;EAGA,SAAO+F,kBAAP;EACD,CAfD;;EC9BA;;;;;;;AAQA,EAMA;;;;;;;AAMA,QAAaK,qBAAqB,GAAG,MAAM;EACzCrB,EAAAA,gBAAgB,CAAC,UAAD,EAAc7D,KAAD,IAAW;EACtC,UAAMlB,SAAS,GAAGE,yBAAU,CAACC,eAAX,EAAlB;EAEAe,IAAAA,KAAK,CAACmF,SAAN,CAAgBT,oBAAoB,CAAC5F,SAAD,CAApB,CAAgCoF,IAAhC,CAAsCkB,aAAD,IAAmB;EACtE,MAA2C;EACzC,YAAIA,aAAa,CAACjH,MAAd,GAAuB,CAA3B,EAA8B;EAC5BN,UAAAA,iBAAM,CAACE,GAAP,CAAY,sDAAD,GACN,gBADL,EACsBqH,aADtB;EAED;EACF;EACF,KAPe,CAAhB;EAQD,GAXe,CAAhB;EAYD,CAbM;;ECpBP;;;;;;;AAQA,EAKA;;;;;;;;;;;;;;;;;;;;AAmBA,QAAahD,mBAAiB,GAAIhF,GAAD,IAAS;EACxC,QAAMiF,kBAAkB,GAAGC,6BAA6B,EAAxD;EACA,SAAOD,kBAAkB,CAACD,iBAAnB,CAAqChF,GAArC,CAAP;EACD,CAHM;;EChCP;;;;;;;AAQA;EAMA,MAAMiI,eAAe,GAAIrF,KAAD,IAAW;EACjC,QAAMqC,kBAAkB,GAAGC,6BAA6B,EAAxD;EACA,QAAM/G,OAAO,GAAGC,eAAe,CAACC,GAAhB,EAAhB;EAEAuE,EAAAA,KAAK,CAACmF,SAAN,CACI9C,kBAAkB,CAACtC,OAAnB,CAA2B;EAACC,IAAAA,KAAD;EAAQzE,IAAAA;EAAR,GAA3B,EACK+J,KADL,CACYC,KAAD,IAAW;EAChB,IAA2C;EACzC1H,MAAAA,iBAAM,CAAC0H,KAAP,CAAc,8CAAD,GACZ,sDADD;EAED,KAJe;;;EAMhB,UAAMA,KAAN;EACD,GARL,CADJ;EAWD,CAfD;;EAiBA,MAAMC,gBAAgB,GAAIxF,KAAD,IAAW;EAClC,QAAMqC,kBAAkB,GAAGC,6BAA6B,EAAxD;EACA,QAAM/G,OAAO,GAAGC,eAAe,CAACC,GAAhB,EAAhB;EAEAuE,EAAAA,KAAK,CAACmF,SAAN,CAAgB9C,kBAAkB,CAACrB,QAAnB,CAA4B;EAAChB,IAAAA,KAAD;EAAQzE,IAAAA;EAAR,GAA5B,CAAhB;EACD,CALD;EAOA;;;;;;;;;;;;;;;;;;;;;AAmBA,QAAakK,QAAQ,GAAIpG,OAAD,IAAa;EACnC,QAAMgD,kBAAkB,GAAGC,6BAA6B,EAAxD;EACAD,EAAAA,kBAAkB,CAACjD,cAAnB,CAAkCC,OAAlC;;EAEA,MAAIA,OAAO,CAAClB,MAAR,GAAiB,CAArB,EAAwB;EACtB;EACA;EACA;EACA0F,IAAAA,gBAAgB,CAAC,SAAD,EAAYwB,eAAZ,CAAhB;EACAxB,IAAAA,gBAAgB,CAAC,UAAD,EAAa2B,gBAAb,CAAhB;EACD;EACF,CAXM;;ECzDP;;;;;;;AAQA,EAKA;;;;;;;;;;;;;;;AAcA,QAAaE,gBAAgB,GAAG,CAACrG,OAAD,EAAUmE,OAAV,KAAsB;EACpDiC,EAAAA,QAAQ,CAACpG,OAAD,CAAR;EACAmF,EAAAA,QAAQ,CAAChB,OAAD,CAAR;EACD,CAHM;;EC3BP;;;
;;;;AAQA;AAWA,EAA2C;EACzClE,EAAAA,iBAAM,CAACqG,OAAP,CAAe,oBAAf;EACD;;;;;;;;;;;;;;;;"} -\ No newline at end of file -+{"version":3,"file":"workbox-precaching.dev.js","sources":["../_version.mjs","../utils/precachePlugins.mjs","../addPlugins.mjs","../utils/cleanRedirect.mjs","../utils/createCacheKey.mjs","../utils/printCleanupDetails.mjs","../utils/printInstallDetails.mjs","../PrecacheController.mjs","../utils/getOrCreatePrecacheController.mjs","../utils/removeIgnoredSearchParams.mjs","../utils/generateURLVariations.mjs","../utils/getCacheKeyForURL.mjs","../utils/addFetchListener.mjs","../addRoute.mjs","../utils/deleteOutdatedCaches.mjs","../cleanupOutdatedCaches.mjs","../getCacheKeyForURL.mjs","../precache.mjs","../precacheAndRoute.mjs","../index.mjs"],"sourcesContent":["try{self['workbox:precaching:4.3.1']&&_()}catch(e){}// eslint-disable-line","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport '../_version.mjs';\n\n\nconst plugins = [];\n\nexport const precachePlugins = {\n /*\n * @return {Array}\n * @private\n */\n get() {\n return plugins;\n },\n\n /*\n * @param {Array} newPlugins\n * @private\n */\n add(newPlugins) {\n plugins.push(...newPlugins);\n },\n};\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {precachePlugins} from './utils/precachePlugins.mjs';\nimport './_version.mjs';\n\n\n/**\n * Adds plugins to precaching.\n *\n * @param {Array} newPlugins\n *\n * @alias workbox.precaching.addPlugins\n */\nconst addPlugins = (newPlugins) => {\n precachePlugins.add(newPlugins);\n};\n\nexport {addPlugins};\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n 
https://opensource.org/licenses/MIT.\n*/\n\nimport '../_version.mjs';\n\n/**\n * @param {Response} response\n * @return {Response}\n *\n * @private\n * @memberof module:workbox-precaching\n */\nexport async function cleanRedirect(response) {\n const clonedResponse = response.clone();\n\n // Not all browsers support the Response.body stream, so fall back\n // to reading the entire body into memory as a blob.\n const bodyPromise = 'body' in clonedResponse ?\n Promise.resolve(clonedResponse.body) :\n clonedResponse.blob();\n\n const body = await bodyPromise;\n\n // new Response() is happy when passed either a stream or a Blob.\n return new Response(body, {\n headers: clonedResponse.headers,\n status: clonedResponse.status,\n statusText: clonedResponse.statusText,\n });\n}\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {WorkboxError} from 'workbox-core/_private/WorkboxError.mjs';\n\nimport '../_version.mjs';\n\n// Name of the search parameter used to store revision info.\nconst REVISION_SEARCH_PARAM = '__WB_REVISION__';\n\n/**\n * Converts a manifest entry into a versioned URL suitable for precaching.\n *\n * @param {Object} entry\n * @return {string} A URL with versioning info.\n *\n * @private\n * @memberof module:workbox-precaching\n */\nexport function createCacheKey(entry) {\n if (!entry) {\n throw new WorkboxError('add-to-cache-list-unexpected-type', {entry});\n }\n\n // If a precache manifest entry is a string, it's assumed to be a versioned\n // URL, like '/app.abcd1234.js'. 
Return as-is.\n if (typeof entry === 'string') {\n const urlObject = new URL(entry, location);\n return {\n cacheKey: urlObject.href,\n url: urlObject.href,\n };\n }\n\n const {revision, url} = entry;\n if (!url) {\n throw new WorkboxError('add-to-cache-list-unexpected-type', {entry});\n }\n\n // If there's just a URL and no revision, then it's also assumed to be a\n // versioned URL.\n if (!revision) {\n const urlObject = new URL(url, location);\n return {\n cacheKey: urlObject.href,\n url: urlObject.href,\n };\n }\n\n // Otherwise, construct a properly versioned URL using the custom Workbox\n // search parameter along with the revision info.\n const originalURL = new URL(url, location);\n const cacheKeyURL = new URL(url, location);\n cacheKeyURL.searchParams.set(REVISION_SEARCH_PARAM, revision);\n return {\n cacheKey: cacheKeyURL.href,\n url: originalURL.href,\n };\n}\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {logger} from 'workbox-core/_private/logger.mjs';\n\nimport '../_version.mjs';\n\nconst logGroup = (groupTitle, deletedURLs) => {\n logger.groupCollapsed(groupTitle);\n\n for (const url of deletedURLs) {\n logger.log(url);\n }\n\n logger.groupEnd();\n};\n\n/**\n * @param {Array} deletedURLs\n *\n * @private\n * @memberof module:workbox-precaching\n */\nexport function printCleanupDetails(deletedURLs) {\n const deletionCount = deletedURLs.length;\n if (deletionCount > 0) {\n logger.groupCollapsed(`During precaching cleanup, ` +\n `${deletionCount} cached ` +\n `request${deletionCount === 1 ? 
' was' : 's were'} deleted.`);\n logGroup('Deleted Cache Requests', deletedURLs);\n logger.groupEnd();\n }\n}\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {logger} from 'workbox-core/_private/logger.mjs';\n\nimport '../_version.mjs';\n\n/**\n * @param {string} groupTitle\n * @param {Array} urls\n *\n * @private\n */\nfunction _nestedGroup(groupTitle, urls) {\n if (urls.length === 0) {\n return;\n }\n\n logger.groupCollapsed(groupTitle);\n\n for (const url of urls) {\n logger.log(url);\n }\n\n logger.groupEnd();\n}\n\n/**\n * @param {Array} urlsToPrecache\n * @param {Array} urlsAlreadyPrecached\n *\n * @private\n * @memberof module:workbox-precaching\n */\nexport function printInstallDetails(urlsToPrecache, urlsAlreadyPrecached) {\n const precachedCount = urlsToPrecache.length;\n const alreadyPrecachedCount = urlsAlreadyPrecached.length;\n\n if (precachedCount || alreadyPrecachedCount) {\n let message =\n `Precaching ${precachedCount} file${precachedCount === 1 ? '' : 's'}.`;\n\n if (alreadyPrecachedCount > 0) {\n message += ` ${alreadyPrecachedCount} ` +\n `file${alreadyPrecachedCount === 1 ? 
' is' : 's are'} already cached.`;\n }\n\n logger.groupCollapsed(message);\n\n _nestedGroup(`View newly precached URLs.`, urlsToPrecache);\n _nestedGroup(`View previously precached URLs.`, urlsAlreadyPrecached);\n logger.groupEnd();\n }\n}\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {assert} from 'workbox-core/_private/assert.mjs';\nimport {cacheNames} from 'workbox-core/_private/cacheNames.mjs';\nimport {cacheWrapper} from 'workbox-core/_private/cacheWrapper.mjs';\nimport {fetchWrapper} from 'workbox-core/_private/fetchWrapper.mjs';\nimport {WorkboxError} from 'workbox-core/_private/WorkboxError.mjs';\n\nimport {cleanRedirect} from './utils/cleanRedirect.mjs';\nimport {createCacheKey} from './utils/createCacheKey.mjs';\nimport {printCleanupDetails} from './utils/printCleanupDetails.mjs';\nimport {printInstallDetails} from './utils/printInstallDetails.mjs';\n\nimport './_version.mjs';\n\n\n/**\n * Performs efficient precaching of assets.\n *\n * @memberof module:workbox-precaching\n */\nclass PrecacheController {\n /**\n * Create a new PrecacheController.\n *\n * @param {string} [cacheName] An optional name for the cache, to override\n * the default precache name.\n */\n constructor(cacheName) {\n this._cacheName = cacheNames.getPrecacheName(cacheName);\n this._urlsToCacheKeys = new Map();\n }\n\n /**\n * This method will add items to the precache list, removing duplicates\n * and ensuring the information is valid.\n *\n * @param {\n * Array\n * } entries Array of entries to precache.\n */\n addToCacheList(entries) {\n if (process.env.NODE_ENV !== 'production') {\n assert.isArray(entries, {\n moduleName: 'workbox-precaching',\n className: 'PrecacheController',\n funcName: 'addToCacheList',\n paramName: 'entries',\n });\n }\n\n for (const entry of entries) {\n const {cacheKey, url} = createCacheKey(entry);\n if 
(this._urlsToCacheKeys.has(url) &&\n this._urlsToCacheKeys.get(url) !== cacheKey) {\n throw new WorkboxError('add-to-cache-list-conflicting-entries', {\n firstEntry: this._urlsToCacheKeys.get(url),\n secondEntry: cacheKey,\n });\n }\n this._urlsToCacheKeys.set(url, cacheKey);\n }\n }\n\n /**\n * Precaches new and updated assets. Call this method from the service worker\n * install event.\n *\n * @param {Object} options\n * @param {Event} [options.event] The install event (if needed).\n * @param {Array} [options.plugins] Plugins to be used for fetching\n * and caching during install.\n * @return {Promise}\n */\n async install({event, plugins} = {}) {\n if (process.env.NODE_ENV !== 'production') {\n if (plugins) {\n assert.isArray(plugins, {\n moduleName: 'workbox-precaching',\n className: 'PrecacheController',\n funcName: 'install',\n paramName: 'plugins',\n });\n }\n }\n\n const urlsToPrecache = [];\n const urlsAlreadyPrecached = [];\n\n const cache = await caches.open(this._cacheName);\n const alreadyCachedRequests = await cache.keys();\n const alreadyCachedURLs = new Set(alreadyCachedRequests.map(\n (request) => request.url));\n\n for (const cacheKey of this._urlsToCacheKeys.values()) {\n if (alreadyCachedURLs.has(cacheKey)) {\n urlsAlreadyPrecached.push(cacheKey);\n } else {\n urlsToPrecache.push(cacheKey);\n }\n }\n\n const precacheRequests = urlsToPrecache.map((url) => {\n return this._addURLToCache({event, plugins, url});\n });\n await Promise.all(precacheRequests);\n\n if (process.env.NODE_ENV !== 'production') {\n printInstallDetails(urlsToPrecache, urlsAlreadyPrecached);\n }\n\n return {\n updatedURLs: urlsToPrecache,\n notUpdatedURLs: urlsAlreadyPrecached,\n };\n }\n\n /**\n * Deletes assets that are no longer present in the current precache manifest.\n * Call this method from the service worker activate event.\n *\n * @return {Promise}\n */\n async activate() {\n const cache = await caches.open(this._cacheName);\n const currentlyCachedRequests = await 
cache.keys();\n const expectedCacheKeys = new Set(this._urlsToCacheKeys.values());\n\n const deletedURLs = [];\n for (const request of currentlyCachedRequests) {\n if (!expectedCacheKeys.has(request.url)) {\n await cache.delete(request);\n deletedURLs.push(request.url);\n }\n }\n\n if (process.env.NODE_ENV !== 'production') {\n printCleanupDetails(deletedURLs);\n }\n\n return {deletedURLs};\n }\n\n /**\n * Requests the entry and saves it to the cache if the response is valid.\n * By default, any response with a status code of less than 400 (including\n * opaque responses) is considered valid.\n *\n * If you need to use custom criteria to determine what's valid and what\n * isn't, then pass in an item in `options.plugins` that implements the\n * `cacheWillUpdate()` lifecycle event.\n *\n * @private\n * @param {Object} options\n * @param {string} options.url The URL to fetch and cache.\n * @param {Event} [options.event] The install event (if passed).\n * @param {Array} [options.plugins] An array of plugins to apply to\n * fetch and caching.\n */\n async _addURLToCache({url, event, plugins}) {\n const request = new Request(url, {credentials: 'same-origin'});\n let response = await fetchWrapper.fetch({\n event,\n plugins,\n request,\n fetchOptions: { importance: 'low'},\n });\n\n // Allow developers to override the default logic about what is and isn't\n // valid by passing in a plugin implementing cacheWillUpdate(), e.g.\n // a workbox.cacheableResponse.Plugin instance.\n let cacheWillUpdateCallback;\n for (const plugin of (plugins || [])) {\n if ('cacheWillUpdate' in plugin) {\n cacheWillUpdateCallback = plugin.cacheWillUpdate.bind(plugin);\n }\n }\n\n const isValidResponse = cacheWillUpdateCallback ?\n // Use a callback if provided. 
It returns a truthy value if valid.\n cacheWillUpdateCallback({event, request, response}) :\n // Otherwise, default to considering any response status under 400 valid.\n // This includes, by default, considering opaque responses valid.\n response.status < 400;\n\n // Consider this a failure, leading to the `install` handler failing, if\n // we get back an invalid response.\n if (!isValidResponse) {\n throw new WorkboxError('bad-precaching-response', {\n url,\n status: response.status,\n });\n }\n\n if (response.redirected) {\n response = await cleanRedirect(response);\n }\n\n await cacheWrapper.put({\n event,\n plugins,\n request,\n response,\n cacheName: this._cacheName,\n matchOptions: {\n ignoreSearch: true,\n },\n });\n }\n\n /**\n * Returns a mapping of a precached URL to the corresponding cache key, taking\n * into account the revision information for the URL.\n *\n * @return {Map} A URL to cache key mapping.\n */\n getURLsToCacheKeys() {\n return this._urlsToCacheKeys;\n }\n\n /**\n * Returns a list of all the URLs that have been precached by the current\n * service worker.\n *\n * @return {Array} The precached URLs.\n */\n getCachedURLs() {\n return [...this._urlsToCacheKeys.keys()];\n }\n\n /**\n * Returns the cache key used for storing a given URL. 
If that URL is\n * unversioned, like `/index.html', then the cache key will be the original\n * URL with a search parameter appended to it.\n *\n * @param {string} url A URL whose cache key you want to look up.\n * @return {string} The versioned URL that corresponds to a cache key\n * for the original URL, or undefined if that URL isn't precached.\n */\n getCacheKeyForURL(url) {\n const urlObject = new URL(url, location);\n return this._urlsToCacheKeys.get(urlObject.href);\n }\n}\n\nexport {PrecacheController};\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {PrecacheController} from '../PrecacheController.mjs';\nimport '../_version.mjs';\n\n\nlet precacheController;\n\n/**\n * @return {PrecacheController}\n * @private\n */\nexport const getOrCreatePrecacheController = () => {\n if (!precacheController) {\n precacheController = new PrecacheController();\n }\n return precacheController;\n};\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport '../_version.mjs';\n\n/**\n * Removes any URL search parameters that should be ignored.\n *\n * @param {URL} urlObject The original URL.\n * @param {Array} ignoreURLParametersMatching RegExps to test against\n * each search parameter name. 
Matches mean that the search parameter should be\n * ignored.\n * @return {URL} The URL with any ignored search parameters removed.\n *\n * @private\n * @memberof module:workbox-precaching\n */\nexport function removeIgnoredSearchParams(urlObject,\n ignoreURLParametersMatching) {\n // Convert the iterable into an array at the start of the loop to make sure\n // deletion doesn't mess up iteration.\n for (const paramName of [...urlObject.searchParams.keys()]) {\n if (ignoreURLParametersMatching.some((regExp) => regExp.test(paramName))) {\n urlObject.searchParams.delete(paramName);\n }\n }\n\n return urlObject;\n}\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {removeIgnoredSearchParams} from './removeIgnoredSearchParams.mjs';\n\nimport '../_version.mjs';\n\n/**\n * Generator function that yields possible variations on the original URL to\n * check, one at a time.\n *\n * @param {string} url\n * @param {Object} options\n *\n * @private\n * @memberof module:workbox-precaching\n */\nexport function* generateURLVariations(url, {\n ignoreURLParametersMatching,\n directoryIndex,\n cleanURLs,\n urlManipulation,\n} = {}) {\n const urlObject = new URL(url, location);\n urlObject.hash = '';\n yield urlObject.href;\n\n const urlWithoutIgnoredParams = removeIgnoredSearchParams(\n urlObject, ignoreURLParametersMatching);\n yield urlWithoutIgnoredParams.href;\n\n if (directoryIndex && urlWithoutIgnoredParams.pathname.endsWith('/')) {\n const directoryURL = new URL(urlWithoutIgnoredParams);\n directoryURL.pathname += directoryIndex;\n yield directoryURL.href;\n }\n\n if (cleanURLs) {\n const cleanURL = new URL(urlWithoutIgnoredParams);\n cleanURL.pathname += '.html';\n yield cleanURL.href;\n }\n\n if (urlManipulation) {\n const additionalURLs = urlManipulation({url: urlObject});\n for (const urlToAttempt of additionalURLs) {\n yield 
urlToAttempt.href;\n }\n }\n}\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {getOrCreatePrecacheController}\n from './getOrCreatePrecacheController.mjs';\nimport {generateURLVariations} from './generateURLVariations.mjs';\nimport '../_version.mjs';\n\n/**\n * This function will take the request URL and manipulate it based on the\n * configuration options.\n *\n * @param {string} url\n * @param {Object} options\n * @return {string} Returns the URL in the cache that matches the request,\n * if possible.\n *\n * @private\n */\nexport const getCacheKeyForURL = (url, options) => {\n const precacheController = getOrCreatePrecacheController();\n\n const urlsToCacheKeys = precacheController.getURLsToCacheKeys();\n for (const possibleURL of generateURLVariations(url, options)) {\n const possibleCacheKey = urlsToCacheKeys.get(possibleURL);\n if (possibleCacheKey) {\n return possibleCacheKey;\n }\n }\n};\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {cacheNames} from 'workbox-core/_private/cacheNames.mjs';\nimport {getFriendlyURL} from 'workbox-core/_private/getFriendlyURL.mjs';\nimport {logger} from 'workbox-core/_private/logger.mjs';\nimport {getCacheKeyForURL} from './getCacheKeyForURL.mjs';\nimport '../_version.mjs';\n\n\n/**\n * Adds a `fetch` listener to the service worker that will\n * respond to\n * [network requests]{@link https://developer.mozilla.org/en-US/docs/Web/API/Service_Worker_API/Using_Service_Workers#Custom_responses_to_requests}\n * with precached assets.\n *\n * Requests for assets that aren't precached, the `FetchEvent` will not be\n * responded to, allowing the event to fall through to other `fetch` event\n * listeners.\n *\n * NOTE: when called more 
than once this method will replace the previously set\n * configuration options. Calling it more than once is not recommended outside\n * of tests.\n *\n * @private\n * @param {Object} options\n * @param {string} [options.directoryIndex=index.html] The `directoryIndex` will\n * check cache entries for a URLs ending with '/' to see if there is a hit when\n * appending the `directoryIndex` value.\n * @param {Array} [options.ignoreURLParametersMatching=[/^utm_/]] An\n * array of regex's to remove search params when looking for a cache match.\n * @param {boolean} [options.cleanURLs=true] The `cleanURLs` option will\n * check the cache for the URL with a `.html` added to the end of the end.\n * @param {workbox.precaching~urlManipulation} [options.urlManipulation]\n * This is a function that should take a URL and return an array of\n * alternative URL's that should be checked for precache matches.\n */\nexport const addFetchListener = ({\n ignoreURLParametersMatching = [/^utm_/],\n directoryIndex = 'index.html',\n cleanURLs = true,\n urlManipulation = null,\n} = {}) => {\n const cacheName = cacheNames.getPrecacheName();\n\n addEventListener('fetch', (event) => {\n const precachedURL = getCacheKeyForURL(event.request.url, {\n cleanURLs,\n directoryIndex,\n ignoreURLParametersMatching,\n urlManipulation,\n });\n if (!precachedURL) {\n if (process.env.NODE_ENV !== 'production') {\n logger.debug(`Precaching did not find a match for ` +\n getFriendlyURL(event.request.url));\n }\n return;\n }\n\n let responsePromise = caches.open(cacheName).then((cache) => {\n return cache.match(precachedURL);\n }).then((cachedResponse) => {\n if (cachedResponse) {\n return cachedResponse;\n }\n\n // Fall back to the network if we don't have a cached response\n // (perhaps due to manual cache cleanup).\n if (process.env.NODE_ENV !== 'production') {\n logger.warn(`The precached response for ` +\n `${getFriendlyURL(precachedURL)} in ${cacheName} was not found. 
` +\n `Falling back to the network instead.`);\n }\n\n return fetch(precachedURL);\n });\n\n if (process.env.NODE_ENV !== 'production') {\n responsePromise = responsePromise.then((response) => {\n // Workbox is going to handle the route.\n // print the routing details to the console.\n logger.groupCollapsed(`Precaching is responding to: ` +\n getFriendlyURL(event.request.url));\n logger.log(`Serving the precached url: ${precachedURL}`);\n\n logger.groupCollapsed(`View request details here.`);\n logger.log(event.request);\n logger.groupEnd();\n\n logger.groupCollapsed(`View response details here.`);\n logger.log(response);\n logger.groupEnd();\n\n logger.groupEnd();\n return response;\n });\n }\n\n event.respondWith(responsePromise);\n });\n};\n","\n/*\n Copyright 2019 Google LLC\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {addFetchListener} from './utils/addFetchListener.mjs';\nimport './_version.mjs';\n\n\nlet listenerAdded = false;\n\n/**\n * Add a `fetch` listener to the service worker that will\n * respond to\n * [network requests]{@link https://developer.mozilla.org/en-US/docs/Web/API/Service_Worker_API/Using_Service_Workers#Custom_responses_to_requests}\n * with precached assets.\n *\n * Requests for assets that aren't precached, the `FetchEvent` will not be\n * responded to, allowing the event to fall through to other `fetch` event\n * listeners.\n *\n * @param {Object} options\n * @param {string} [options.directoryIndex=index.html] The `directoryIndex` will\n * check cache entries for a URLs ending with '/' to see if there is a hit when\n * appending the `directoryIndex` value.\n * @param {Array} [options.ignoreURLParametersMatching=[/^utm_/]] An\n * array of regex's to remove search params when looking for a cache match.\n * @param {boolean} [options.cleanURLs=true] The `cleanURLs` option will\n * check the cache for the URL with a `.html` 
added to the end of the end.\n * @param {workbox.precaching~urlManipulation} [options.urlManipulation]\n * This is a function that should take a URL and return an array of\n * alternative URL's that should be checked for precache matches.\n *\n * @alias workbox.precaching.addRoute\n */\nexport const addRoute = (options) => {\n if (!listenerAdded) {\n addFetchListener(options);\n listenerAdded = true;\n }\n};\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport '../_version.mjs';\n\nconst SUBSTRING_TO_FIND = '-precache-';\n\n/**\n * Cleans up incompatible precaches that were created by older versions of\n * Workbox, by a service worker registered under the current scope.\n *\n * This is meant to be called as part of the `activate` event.\n *\n * This should be safe to use as long as you don't include `substringToFind`\n * (defaulting to `-precache-`) in your non-precache cache names.\n *\n * @param {string} currentPrecacheName The cache name currently in use for\n * precaching. 
This cache won't be deleted.\n * @param {string} [substringToFind='-precache-'] Cache names which include this\n * substring will be deleted (excluding `currentPrecacheName`).\n * @return {Array} A list of all the cache names that were deleted.\n *\n * @private\n * @memberof module:workbox-precaching\n */\nconst deleteOutdatedCaches = async (\n currentPrecacheName,\n substringToFind = SUBSTRING_TO_FIND) => {\n const cacheNames = await caches.keys();\n\n const cacheNamesToDelete = cacheNames.filter((cacheName) => {\n return cacheName.includes(substringToFind) &&\n cacheName.includes(self.registration.scope) &&\n cacheName !== currentPrecacheName;\n });\n\n await Promise.all(\n cacheNamesToDelete.map((cacheName) => caches.delete(cacheName)));\n\n return cacheNamesToDelete;\n};\n\nexport {deleteOutdatedCaches};\n\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {cacheNames} from 'workbox-core/_private/cacheNames.mjs';\nimport {logger} from 'workbox-core/_private/logger.mjs';\nimport {deleteOutdatedCaches} from './utils/deleteOutdatedCaches.mjs';\nimport './_version.mjs';\n\n\n/**\n * Adds an `activate` event listener which will clean up incompatible\n * precaches that were created by older versions of Workbox.\n *\n * @alias workbox.precaching.cleanupOutdatedCaches\n */\nexport const cleanupOutdatedCaches = () => {\n addEventListener('activate', (event) => {\n const cacheName = cacheNames.getPrecacheName();\n\n event.waitUntil(deleteOutdatedCaches(cacheName).then((cachesDeleted) => {\n if (process.env.NODE_ENV !== 'production') {\n if (cachesDeleted.length > 0) {\n logger.log(`The following out-of-date precaches were cleaned up ` +\n `automatically:`, cachesDeleted);\n }\n }\n }));\n });\n};\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the 
LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {getOrCreatePrecacheController}\n from './utils/getOrCreatePrecacheController.mjs';\nimport './_version.mjs';\n\n\n/**\n * Takes in a URL, and returns the corresponding URL that could be used to\n * lookup the entry in the precache.\n *\n * If a relative URL is provided, the location of the service worker file will\n * be used as the base.\n *\n * For precached entries without revision information, the cache key will be the\n * same as the original URL.\n *\n * For precached entries with revision information, the cache key will be the\n * original URL with the addition of a query parameter used for keeping track of\n * the revision info.\n *\n * @param {string} url The URL whose cache key to look up.\n * @return {string} The cache key that corresponds to that URL.\n *\n * @alias workbox.precaching.getCacheKeyForURL\n */\nexport const getCacheKeyForURL = (url) => {\n const precacheController = getOrCreatePrecacheController();\n return precacheController.getCacheKeyForURL(url);\n};\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {logger} from 'workbox-core/_private/logger.mjs';\nimport {getOrCreatePrecacheController} from './utils/getOrCreatePrecacheController.mjs';\nimport {precachePlugins} from './utils/precachePlugins.mjs';\nimport './_version.mjs';\n\n\nconst installListener = (event) => {\n const precacheController = getOrCreatePrecacheController();\n const plugins = precachePlugins.get();\n\n event.waitUntil(\n precacheController.install({event, plugins})\n .catch((error) => {\n if (process.env.NODE_ENV !== 'production') {\n logger.error(`Service worker installation failed. 
It will ` +\n `be retried automatically during the next navigation.`);\n }\n // Re-throw the error to ensure installation fails.\n throw error;\n })\n );\n};\n\nconst activateListener = (event) => {\n const precacheController = getOrCreatePrecacheController();\n const plugins = precachePlugins.get();\n\n event.waitUntil(precacheController.activate({event, plugins}));\n};\n\n/**\n * Adds items to the precache list, removing any duplicates and\n * stores the files in the\n * [\"precache cache\"]{@link module:workbox-core.cacheNames} when the service\n * worker installs.\n *\n * This method can be called multiple times.\n *\n * Please note: This method **will not** serve any of the cached files for you.\n * It only precaches files. To respond to a network request you call\n * [addRoute()]{@link module:workbox-precaching.addRoute}.\n *\n * If you have a single array of files to precache, you can just call\n * [precacheAndRoute()]{@link module:workbox-precaching.precacheAndRoute}.\n *\n * @param {Array} entries Array of entries to precache.\n *\n * @alias workbox.precaching.precache\n */\nexport const precache = (entries) => {\n const precacheController = getOrCreatePrecacheController();\n precacheController.addToCacheList(entries);\n\n if (entries.length > 0) {\n // NOTE: these listeners will only be added once (even if the `precache()`\n // method is called multiple times) because event listeners are implemented\n // as a set, where each listener must be unique.\n addEventListener('install', installListener);\n addEventListener('activate', activateListener);\n }\n};\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {addRoute} from './addRoute.mjs';\nimport {precache} from './precache.mjs';\nimport './_version.mjs';\n\n\n/**\n * This method will add entries to the precache list and add a route to\n * respond to fetch 
events.\n *\n * This is a convenience method that will call\n * [precache()]{@link module:workbox-precaching.precache} and\n * [addRoute()]{@link module:workbox-precaching.addRoute} in a single call.\n *\n * @param {Array} entries Array of entries to precache.\n * @param {Object} options See\n * [addRoute() options]{@link module:workbox-precaching.addRoute}.\n *\n * @alias workbox.precaching.precacheAndRoute\n */\nexport const precacheAndRoute = (entries, options) => {\n precache(entries);\n addRoute(options);\n};\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {assert} from 'workbox-core/_private/assert.mjs';\nimport {addPlugins} from './addPlugins.mjs';\nimport {addRoute} from './addRoute.mjs';\nimport {cleanupOutdatedCaches} from './cleanupOutdatedCaches.mjs';\nimport {getCacheKeyForURL} from './getCacheKeyForURL.mjs';\nimport {precache} from './precache.mjs';\nimport {precacheAndRoute} from './precacheAndRoute.mjs';\nimport {PrecacheController} from './PrecacheController.mjs';\nimport './_version.mjs';\n\n\nif (process.env.NODE_ENV !== 'production') {\n assert.isSWEnv('workbox-precaching');\n}\n\n/**\n * Most consumers of this module will want to use the\n * [precacheAndRoute()]{@link workbox.precaching.precacheAndRoute}\n * method to add assets to the Cache and respond to network requests with these\n * cached assets.\n *\n * If you require finer grained control, you can use the\n * [PrecacheController]{@link workbox.precaching.PrecacheController}\n * to determine when performed.\n *\n * @namespace workbox.precaching\n */\n\nexport {\n addPlugins,\n addRoute,\n cleanupOutdatedCaches,\n getCacheKeyForURL,\n precache,\n precacheAndRoute,\n 
PrecacheController,\n};\n"],"names":["self","_","e","plugins","precachePlugins","get","add","newPlugins","push","addPlugins","cleanRedirect","response","clonedResponse","clone","bodyPromise","Promise","resolve","body","blob","Response","headers","status","statusText","REVISION_SEARCH_PARAM","createCacheKey","entry","WorkboxError","urlObject","URL","location","cacheKey","href","url","revision","originalURL","cacheKeyURL","searchParams","set","logGroup","groupTitle","deletedURLs","logger","groupCollapsed","log","groupEnd","printCleanupDetails","deletionCount","length","_nestedGroup","urls","printInstallDetails","urlsToPrecache","urlsAlreadyPrecached","precachedCount","alreadyPrecachedCount","message","PrecacheController","constructor","cacheName","_cacheName","cacheNames","getPrecacheName","_urlsToCacheKeys","Map","addToCacheList","entries","assert","isArray","moduleName","className","funcName","paramName","has","firstEntry","secondEntry","install","event","cache","caches","open","alreadyCachedRequests","keys","alreadyCachedURLs","Set","map","request","values","precacheRequests","_addURLToCache","all","updatedURLs","notUpdatedURLs","activate","currentlyCachedRequests","expectedCacheKeys","delete","Request","credentials","fetchWrapper","fetch","fetchOptions","importance","cacheWillUpdateCallback","plugin","cacheWillUpdate","bind","isValidResponse","redirected","cacheWrapper","put","matchOptions","ignoreSearch","getURLsToCacheKeys","getCachedURLs","getCacheKeyForURL","precacheController","getOrCreatePrecacheController","removeIgnoredSearchParams","ignoreURLParametersMatching","some","regExp","test","generateURLVariations","directoryIndex","cleanURLs","urlManipulation","hash","urlWithoutIgnoredParams","pathname","endsWith","directoryURL","cleanURL","additionalURLs","urlToAttempt","options","urlsToCacheKeys","possibleURL","possibleCacheKey","addFetchListener","addEventListener","precachedURL","debug","getFriendlyURL","responsePromise","then","match","cachedResponse","warn
","respondWith","listenerAdded","addRoute","SUBSTRING_TO_FIND","deleteOutdatedCaches","currentPrecacheName","substringToFind","cacheNamesToDelete","filter","includes","registration","scope","cleanupOutdatedCaches","waitUntil","cachesDeleted","installListener","catch","error","activateListener","precache","precacheAndRoute","isSWEnv"],"mappings":";;;;EAAA,IAAG;EAACA,EAAAA,IAAI,CAAC,0BAAD,CAAJ,IAAkCC,CAAC,EAAnC;EAAsC,CAA1C,CAA0C,OAAMC,CAAN,EAAQ;;ECAlD;;;;;;;AAQA,EAGA,MAAMC,OAAO,GAAG,EAAhB;AAEA,EAAO,MAAMC,eAAe,GAAG;EAC7B;;;;EAIAC,EAAAA,GAAG,GAAG;EACJ,WAAOF,OAAP;EACD,GAP4B;;EAS7B;;;;EAIAG,EAAAA,GAAG,CAACC,UAAD,EAAa;EACdJ,IAAAA,OAAO,CAACK,IAAR,CAAa,GAAGD,UAAhB;EACD;;EAf4B,CAAxB;;ECbP;;;;;;;AAQA,EAIA;;;;;;;;AAOA,QAAME,UAAU,GAAIF,UAAD,IAAgB;EACjCH,EAAAA,eAAe,CAACE,GAAhB,CAAoBC,UAApB;EACD,CAFD;;ECnBA;;;;;;;AAQA,EAEA;;;;;;;;AAOA,EAAO,eAAeG,aAAf,CAA6BC,QAA7B,EAAuC;EAC5C,QAAMC,cAAc,GAAGD,QAAQ,CAACE,KAAT,EAAvB,CAD4C;EAI5C;;EACA,QAAMC,WAAW,GAAG,UAAUF,cAAV,GAClBG,OAAO,CAACC,OAAR,CAAgBJ,cAAc,CAACK,IAA/B,CADkB,GAElBL,cAAc,CAACM,IAAf,EAFF;EAIA,QAAMD,IAAI,GAAG,MAAMH,WAAnB,CAT4C;;EAY5C,SAAO,IAAIK,QAAJ,CAAaF,IAAb,EAAmB;EACxBG,IAAAA,OAAO,EAAER,cAAc,CAACQ,OADA;EAExBC,IAAAA,MAAM,EAAET,cAAc,CAACS,MAFC;EAGxBC,IAAAA,UAAU,EAAEV,cAAc,CAACU;EAHH,GAAnB,CAAP;EAKD;;EClCD;;;;;;;AAQA;EAKA,MAAMC,qBAAqB,GAAG,iBAA9B;EAEA;;;;;;;;;;AASA,EAAO,SAASC,cAAT,CAAwBC,KAAxB,EAA+B;EACpC,MAAI,CAACA,KAAL,EAAY;EACV,UAAM,IAAIC,6BAAJ,CAAiB,mCAAjB,EAAsD;EAACD,MAAAA;EAAD,KAAtD,CAAN;EACD,GAHmC;EAMpC;;;EACA,MAAI,OAAOA,KAAP,KAAiB,QAArB,EAA+B;EAC7B,UAAME,SAAS,GAAG,IAAIC,GAAJ,CAAQH,KAAR,EAAeI,QAAf,CAAlB;EACA,WAAO;EACLC,MAAAA,QAAQ,EAAEH,SAAS,CAACI,IADf;EAELC,MAAAA,GAAG,EAAEL,SAAS,CAACI;EAFV,KAAP;EAID;;EAED,QAAM;EAACE,IAAAA,QAAD;EAAWD,IAAAA;EAAX,MAAkBP,KAAxB;;EACA,MAAI,CAACO,GAAL,EAAU;EACR,UAAM,IAAIN,6BAAJ,CAAiB,mCAAjB,EAAsD;EAACD,MAAAA;EAAD,KAAtD,CAAN;EACD,GAlBmC;EAqBpC;;;EACA,MAAI,CAACQ,QAAL,EAAe;EACb,UAAMN,SAAS,GAAG,IAAIC,GAAJ,CAAQI,GAAR,EAAaH,QAAb,CAAlB;EACA,WAAO;EACLC,MAAAA,QAAQ,EAAEH,SAAS,CAACI,IADf;EAELC,MAAAA,GAAG,EAAEL
,SAAS,CAACI;EAFV,KAAP;EAID,GA5BmC;EA+BpC;;;EACA,QAAMG,WAAW,GAAG,IAAIN,GAAJ,CAAQI,GAAR,EAAaH,QAAb,CAApB;EACA,QAAMM,WAAW,GAAG,IAAIP,GAAJ,CAAQI,GAAR,EAAaH,QAAb,CAApB;EACAM,EAAAA,WAAW,CAACC,YAAZ,CAAyBC,GAAzB,CAA6Bd,qBAA7B,EAAoDU,QAApD;EACA,SAAO;EACLH,IAAAA,QAAQ,EAAEK,WAAW,CAACJ,IADjB;EAELC,IAAAA,GAAG,EAAEE,WAAW,CAACH;EAFZ,GAAP;EAID;;EC/DD;;;;;;;AAQA;EAIA,MAAMO,QAAQ,GAAG,CAACC,UAAD,EAAaC,WAAb,KAA6B;EAC5CC,EAAAA,iBAAM,CAACC,cAAP,CAAsBH,UAAtB;;EAEA,OAAK,MAAMP,GAAX,IAAkBQ,WAAlB,EAA+B;EAC7BC,IAAAA,iBAAM,CAACE,GAAP,CAAWX,GAAX;EACD;;EAEDS,EAAAA,iBAAM,CAACG,QAAP;EACD,CARD;EAUA;;;;;;;;AAMA,EAAO,SAASC,mBAAT,CAA6BL,WAA7B,EAA0C;EAC/C,QAAMM,aAAa,GAAGN,WAAW,CAACO,MAAlC;;EACA,MAAID,aAAa,GAAG,CAApB,EAAuB;EACrBL,IAAAA,iBAAM,CAACC,cAAP,CAAuB,6BAAD,GACjB,GAAEI,aAAc,UADC,GAEjB,UAASA,aAAa,KAAK,CAAlB,GAAsB,MAAtB,GAA+B,QAAS,WAFtD;EAGAR,IAAAA,QAAQ,CAAC,wBAAD,EAA2BE,WAA3B,CAAR;EACAC,IAAAA,iBAAM,CAACG,QAAP;EACD;EACF;;ECrCD;;;;;;;AAQA,EAIA;;;;;;;EAMA,SAASI,YAAT,CAAsBT,UAAtB,EAAkCU,IAAlC,EAAwC;EACtC,MAAIA,IAAI,CAACF,MAAL,KAAgB,CAApB,EAAuB;EACrB;EACD;;EAEDN,EAAAA,iBAAM,CAACC,cAAP,CAAsBH,UAAtB;;EAEA,OAAK,MAAMP,GAAX,IAAkBiB,IAAlB,EAAwB;EACtBR,IAAAA,iBAAM,CAACE,GAAP,CAAWX,GAAX;EACD;;EAEDS,EAAAA,iBAAM,CAACG,QAAP;EACD;EAED;;;;;;;;;AAOA,EAAO,SAASM,mBAAT,CAA6BC,cAA7B,EAA6CC,oBAA7C,EAAmE;EACxE,QAAMC,cAAc,GAAGF,cAAc,CAACJ,MAAtC;EACA,QAAMO,qBAAqB,GAAGF,oBAAoB,CAACL,MAAnD;;EAEA,MAAIM,cAAc,IAAIC,qBAAtB,EAA6C;EAC3C,QAAIC,OAAO,GACN,cAAaF,cAAe,QAAOA,cAAc,KAAK,CAAnB,GAAuB,EAAvB,GAA4B,GAAI,GADxE;;EAGA,QAAIC,qBAAqB,GAAG,CAA5B,EAA+B;EAC7BC,MAAAA,OAAO,IAAK,IAAGD,qBAAsB,GAA1B,GACR,OAAMA,qBAAqB,KAAK,CAA1B,GAA8B,KAA9B,GAAsC,OAAQ,kBADvD;EAED;;EAEDb,IAAAA,iBAAM,CAACC,cAAP,CAAsBa,OAAtB;;EAEAP,IAAAA,YAAY,CAAE,4BAAF,EAA+BG,cAA/B,CAAZ;;EACAH,IAAAA,YAAY,CAAE,iCAAF,EAAoCI,oBAApC,CAAZ;;EACAX,IAAAA,iBAAM,CAACG,QAAP;EACD;EACF;;EC1DD;;;;;;;AAQA,EAcA;;;;;;EAKA,MAAMY,kBAAN,CAAyB;EACvB;;;;;;EAMAC,EAAAA,WAAW,CAACC,SAAD,EAAY;EACrB,SAAKC,UAAL,GAAkBC,yBAAU,CAACC,eAAX,CAA2BH,SAA3B,CAAlB;EACA,SAAKI,gBAAL,GAAwB,IAAIC,GAAJ,EAAxB;EACD;EAED;;;;;
;;;;;EAQAC,EAAAA,cAAc,CAACC,OAAD,EAAU;EACtB,IAA2C;EACzCC,MAAAA,iBAAM,CAACC,OAAP,CAAeF,OAAf,EAAwB;EACtBG,QAAAA,UAAU,EAAE,oBADU;EAEtBC,QAAAA,SAAS,EAAE,oBAFW;EAGtBC,QAAAA,QAAQ,EAAE,gBAHY;EAItBC,QAAAA,SAAS,EAAE;EAJW,OAAxB;EAMD;;EAED,SAAK,MAAM9C,KAAX,IAAoBwC,OAApB,EAA6B;EAC3B,YAAM;EAACnC,QAAAA,QAAD;EAAWE,QAAAA;EAAX,UAAkBR,cAAc,CAACC,KAAD,CAAtC;;EACA,UAAI,KAAKqC,gBAAL,CAAsBU,GAAtB,CAA0BxC,GAA1B,KACA,KAAK8B,gBAAL,CAAsBzD,GAAtB,CAA0B2B,GAA1B,MAAmCF,QADvC,EACiD;EAC/C,cAAM,IAAIJ,6BAAJ,CAAiB,uCAAjB,EAA0D;EAC9D+C,UAAAA,UAAU,EAAE,KAAKX,gBAAL,CAAsBzD,GAAtB,CAA0B2B,GAA1B,CADkD;EAE9D0C,UAAAA,WAAW,EAAE5C;EAFiD,SAA1D,CAAN;EAID;;EACD,WAAKgC,gBAAL,CAAsBzB,GAAtB,CAA0BL,GAA1B,EAA+BF,QAA/B;EACD;EACF;EAED;;;;;;;;;;;;EAUA,QAAM6C,OAAN,CAAc;EAACC,IAAAA,KAAD;EAAQzE,IAAAA;EAAR,MAAmB,EAAjC,EAAqC;EACnC,IAA2C;EACzC,UAAIA,OAAJ,EAAa;EACX+D,QAAAA,iBAAM,CAACC,OAAP,CAAehE,OAAf,EAAwB;EACtBiE,UAAAA,UAAU,EAAE,oBADU;EAEtBC,UAAAA,SAAS,EAAE,oBAFW;EAGtBC,UAAAA,QAAQ,EAAE,SAHY;EAItBC,UAAAA,SAAS,EAAE;EAJW,SAAxB;EAMD;EACF;;EAED,UAAMpB,cAAc,GAAG,EAAvB;EACA,UAAMC,oBAAoB,GAAG,EAA7B;EAEA,UAAMyB,KAAK,GAAG,MAAMC,MAAM,CAACC,IAAP,CAAY,KAAKpB,UAAjB,CAApB;EACA,UAAMqB,qBAAqB,GAAG,MAAMH,KAAK,CAACI,IAAN,EAApC;EACA,UAAMC,iBAAiB,GAAG,IAAIC,GAAJ,CAAQH,qBAAqB,CAACI,GAAtB,CAC7BC,OAAD,IAAaA,OAAO,CAACrD,GADS,CAAR,CAA1B;;EAGA,SAAK,MAAMF,QAAX,IAAuB,KAAKgC,gBAAL,CAAsBwB,MAAtB,EAAvB,EAAuD;EACrD,UAAIJ,iBAAiB,CAACV,GAAlB,CAAsB1C,QAAtB,CAAJ,EAAqC;EACnCsB,QAAAA,oBAAoB,CAAC5C,IAArB,CAA0BsB,QAA1B;EACD,OAFD,MAEO;EACLqB,QAAAA,cAAc,CAAC3C,IAAf,CAAoBsB,QAApB;EACD;EACF;;EAED,UAAMyD,gBAAgB,GAAGpC,cAAc,CAACiC,GAAf,CAAoBpD,GAAD,IAAS;EACnD,aAAO,KAAKwD,cAAL,CAAoB;EAACZ,QAAAA,KAAD;EAAQzE,QAAAA,OAAR;EAAiB6B,QAAAA;EAAjB,OAApB,CAAP;EACD,KAFwB,CAAzB;EAGA,UAAMjB,OAAO,CAAC0E,GAAR,CAAYF,gBAAZ,CAAN;;EAEA,IAA2C;EACzCrC,MAAAA,mBAAmB,CAACC,cAAD,EAAiBC,oBAAjB,CAAnB;EACD;;EAED,WAAO;EACLsC,MAAAA,WAAW,EAAEvC,cADR;EAELwC,MAAAA,cAAc,EAAEvC;EAFX,KAAP;EAID;EAED;;;;;;;;EAMA,QAAMwC,QAAN,GAAiB;EACf,UAAMf,KAAK,GAAG,MAAMC,MAAM,CAACC,IAAP,CAAY,KAAKpB,UAAjB,CAApB;EACA,UAAMkC,uBA
AuB,GAAG,MAAMhB,KAAK,CAACI,IAAN,EAAtC;EACA,UAAMa,iBAAiB,GAAG,IAAIX,GAAJ,CAAQ,KAAKrB,gBAAL,CAAsBwB,MAAtB,EAAR,CAA1B;EAEA,UAAM9C,WAAW,GAAG,EAApB;;EACA,SAAK,MAAM6C,OAAX,IAAsBQ,uBAAtB,EAA+C;EAC7C,UAAI,CAACC,iBAAiB,CAACtB,GAAlB,CAAsBa,OAAO,CAACrD,GAA9B,CAAL,EAAyC;EACvC,cAAM6C,KAAK,CAACkB,MAAN,CAAaV,OAAb,CAAN;EACA7C,QAAAA,WAAW,CAAChC,IAAZ,CAAiB6E,OAAO,CAACrD,GAAzB;EACD;EACF;;EAED,IAA2C;EACzCa,MAAAA,mBAAmB,CAACL,WAAD,CAAnB;EACD;;EAED,WAAO;EAACA,MAAAA;EAAD,KAAP;EACD;EAED;;;;;;;;;;;;;;;;;;EAgBA,QAAMgD,cAAN,CAAqB;EAACxD,IAAAA,GAAD;EAAM4C,IAAAA,KAAN;EAAazE,IAAAA;EAAb,GAArB,EAA4C;EAC1C,UAAMkF,OAAO,GAAG,IAAIW,OAAJ,CAAYhE,GAAZ,EAAiB;EAACiE,MAAAA,WAAW,EAAE;EAAd,KAAjB,CAAhB;EACA,QAAItF,QAAQ,GAAG,MAAMuF,6BAAY,CAACC,KAAb,CAAmB;EACtCvB,MAAAA,KADsC;EAEtCzE,MAAAA,OAFsC;EAGtCkF,MAAAA,OAHsC;EAItCe,MAAAA,YAAY,EAAE;EAAEC,QAAAA,UAAU,EAAE;EAAd;EAJwB,KAAnB,CAArB,CAF0C;EAU1C;EACA;;EACA,QAAIC,uBAAJ;;EACA,SAAK,MAAMC,MAAX,IAAsBpG,OAAO,IAAI,EAAjC,EAAsC;EACpC,UAAI,qBAAqBoG,MAAzB,EAAiC;EAC/BD,QAAAA,uBAAuB,GAAGC,MAAM,CAACC,eAAP,CAAuBC,IAAvB,CAA4BF,MAA5B,CAA1B;EACD;EACF;;EAED,UAAMG,eAAe,GAAGJ,uBAAuB;EAE7CA,IAAAA,uBAAuB,CAAC;EAAC1B,MAAAA,KAAD;EAAQS,MAAAA,OAAR;EAAiB1E,MAAAA;EAAjB,KAAD,CAFsB;EAI7C;EACAA,IAAAA,QAAQ,CAACU,MAAT,GAAkB,GALpB,CAnB0C;EA2B1C;;EACA,QAAI,CAACqF,eAAL,EAAsB;EACpB,YAAM,IAAIhF,6BAAJ,CAAiB,yBAAjB,EAA4C;EAChDM,QAAAA,GADgD;EAEhDX,QAAAA,MAAM,EAAEV,QAAQ,CAACU;EAF+B,OAA5C,CAAN;EAID;;EAED,QAAIV,QAAQ,CAACgG,UAAb,EAAyB;EACvBhG,MAAAA,QAAQ,GAAG,MAAMD,aAAa,CAACC,QAAD,CAA9B;EACD;;EAED,UAAMiG,6BAAY,CAACC,GAAb,CAAiB;EACrBjC,MAAAA,KADqB;EAErBzE,MAAAA,OAFqB;EAGrBkF,MAAAA,OAHqB;EAIrB1E,MAAAA,QAJqB;EAKrB+C,MAAAA,SAAS,EAAE,KAAKC,UALK;EAMrBmD,MAAAA,YAAY,EAAE;EACZC,QAAAA,YAAY,EAAE;EADF;EANO,KAAjB,CAAN;EAUD;EAED;;;;;;;;EAMAC,EAAAA,kBAAkB,GAAG;EACnB,WAAO,KAAKlD,gBAAZ;EACD;EAED;;;;;;;;EAMAmD,EAAAA,aAAa,GAAG;EACd,WAAO,CAAC,GAAG,KAAKnD,gBAAL,CAAsBmB,IAAtB,EAAJ,CAAP;EACD;EAED;;;;;;;;;;;EASAiC,EAAAA,iBAAiB,CAAClF,GAAD,EAAM;EACrB,UAAML,SAAS,GAAG,IAAIC,GAAJ,CAAQI,GAAR,EAAaH,QAAb,CAAlB;EACA,WAAO,KAAKiC,gBAAL,CAAsBzD,
GAAtB,CAA0BsB,SAAS,CAACI,IAApC,CAAP;EACD;;EA7NsB;;EC3BzB;;;;;;;AAQA,EAIA,IAAIoF,kBAAJ;EAEA;;;;;AAIA,EAAO,MAAMC,6BAA6B,GAAG,MAAM;EACjD,MAAI,CAACD,kBAAL,EAAyB;EACvBA,IAAAA,kBAAkB,GAAG,IAAI3D,kBAAJ,EAArB;EACD;;EACD,SAAO2D,kBAAP;EACD,CALM;;EClBP;;;;;;;AAQA,EAEA;;;;;;;;;;;;;AAYA,EAAO,SAASE,yBAAT,CAAmC1F,SAAnC,EACH2F,2BADG,EAC0B;EAC/B;EACA;EACA,OAAK,MAAM/C,SAAX,IAAwB,CAAC,GAAG5C,SAAS,CAACS,YAAV,CAAuB6C,IAAvB,EAAJ,CAAxB,EAA4D;EAC1D,QAAIqC,2BAA2B,CAACC,IAA5B,CAAkCC,MAAD,IAAYA,MAAM,CAACC,IAAP,CAAYlD,SAAZ,CAA7C,CAAJ,EAA0E;EACxE5C,MAAAA,SAAS,CAACS,YAAV,CAAuB2D,MAAvB,CAA8BxB,SAA9B;EACD;EACF;;EAED,SAAO5C,SAAP;EACD;;ECjCD;;;;;;;AAQA,EAIA;;;;;;;;;;;AAUA,EAAO,UAAU+F,qBAAV,CAAgC1F,GAAhC,EAAqC;EAC1CsF,EAAAA,2BAD0C;EAE1CK,EAAAA,cAF0C;EAG1CC,EAAAA,SAH0C;EAI1CC,EAAAA;EAJ0C,IAKxC,EALG,EAKC;EACN,QAAMlG,SAAS,GAAG,IAAIC,GAAJ,CAAQI,GAAR,EAAaH,QAAb,CAAlB;EACAF,EAAAA,SAAS,CAACmG,IAAV,GAAiB,EAAjB;EACA,QAAMnG,SAAS,CAACI,IAAhB;EAEA,QAAMgG,uBAAuB,GAAGV,yBAAyB,CACrD1F,SADqD,EAC1C2F,2BAD0C,CAAzD;EAEA,QAAMS,uBAAuB,CAAChG,IAA9B;;EAEA,MAAI4F,cAAc,IAAII,uBAAuB,CAACC,QAAxB,CAAiCC,QAAjC,CAA0C,GAA1C,CAAtB,EAAsE;EACpE,UAAMC,YAAY,GAAG,IAAItG,GAAJ,CAAQmG,uBAAR,CAArB;EACAG,IAAAA,YAAY,CAACF,QAAb,IAAyBL,cAAzB;EACA,UAAMO,YAAY,CAACnG,IAAnB;EACD;;EAED,MAAI6F,SAAJ,EAAe;EACb,UAAMO,QAAQ,GAAG,IAAIvG,GAAJ,CAAQmG,uBAAR,CAAjB;EACAI,IAAAA,QAAQ,CAACH,QAAT,IAAqB,OAArB;EACA,UAAMG,QAAQ,CAACpG,IAAf;EACD;;EAED,MAAI8F,eAAJ,EAAqB;EACnB,UAAMO,cAAc,GAAGP,eAAe,CAAC;EAAC7F,MAAAA,GAAG,EAAEL;EAAN,KAAD,CAAtC;;EACA,SAAK,MAAM0G,YAAX,IAA2BD,cAA3B,EAA2C;EACzC,YAAMC,YAAY,CAACtG,IAAnB;EACD;EACF;EACF;;ECtDD;;;;;;;AAQA,EAKA;;;;;;;;;;;;AAWA,EAAO,MAAMmF,iBAAiB,GAAG,CAAClF,GAAD,EAAMsG,OAAN,KAAkB;EACjD,QAAMnB,kBAAkB,GAAGC,6BAA6B,EAAxD;EAEA,QAAMmB,eAAe,GAAGpB,kBAAkB,CAACH,kBAAnB,EAAxB;;EACA,OAAK,MAAMwB,WAAX,IAA0Bd,qBAAqB,CAAC1F,GAAD,EAAMsG,OAAN,CAA/C,EAA+D;EAC7D,UAAMG,gBAAgB,GAAGF,eAAe,CAAClI,GAAhB,CAAoBmI,WAApB,CAAzB;;EACA,QAAIC,gBAAJ,EAAsB;EACpB,aAAOA,gBAAP;EACD;EACF;EACF,CAVM;;ECxBP;;;;;;;AAQA,EAOA;;;;;;;;;;;;;;;;;;;;;;;;;;;;AA2BA,EAAO,MAAMC,gBAAg
B,GAAG,CAAC;EAC/BpB,EAAAA,2BAA2B,GAAG,CAAC,OAAD,CADC;EAE/BK,EAAAA,cAAc,GAAG,YAFc;EAG/BC,EAAAA,SAAS,GAAG,IAHmB;EAI/BC,EAAAA,eAAe,GAAG;EAJa,IAK7B,EAL4B,KAKrB;EACT,QAAMnE,SAAS,GAAGE,yBAAU,CAACC,eAAX,EAAlB;EAEA8E,EAAAA,gBAAgB,CAAC,OAAD,EAAW/D,KAAD,IAAW;EACnC,UAAMgE,YAAY,GAAG1B,iBAAiB,CAACtC,KAAK,CAACS,OAAN,CAAcrD,GAAf,EAAoB;EACxD4F,MAAAA,SADwD;EAExDD,MAAAA,cAFwD;EAGxDL,MAAAA,2BAHwD;EAIxDO,MAAAA;EAJwD,KAApB,CAAtC;;EAMA,QAAI,CAACe,YAAL,EAAmB;EACjB,MAA2C;EACzCnG,QAAAA,iBAAM,CAACoG,KAAP,CAAc,sCAAD,GACXC,iCAAc,CAAClE,KAAK,CAACS,OAAN,CAAcrD,GAAf,CADhB;EAED;;EACD;EACD;;EAED,QAAI+G,eAAe,GAAGjE,MAAM,CAACC,IAAP,CAAYrB,SAAZ,EAAuBsF,IAAvB,CAA6BnE,KAAD,IAAW;EAC3D,aAAOA,KAAK,CAACoE,KAAN,CAAYL,YAAZ,CAAP;EACD,KAFqB,EAEnBI,IAFmB,CAEbE,cAAD,IAAoB;EAC1B,UAAIA,cAAJ,EAAoB;EAClB,eAAOA,cAAP;EACD,OAHyB;EAM1B;;;EACA,MAA2C;EACzCzG,QAAAA,iBAAM,CAAC0G,IAAP,CAAa,6BAAD,GACX,GAAEL,iCAAc,CAACF,YAAD,CAAe,OAAMlF,SAAU,kBADpC,GAEX,sCAFD;EAGD;;EAED,aAAOyC,KAAK,CAACyC,YAAD,CAAZ;EACD,KAhBqB,CAAtB;;EAkBA,IAA2C;EACzCG,MAAAA,eAAe,GAAGA,eAAe,CAACC,IAAhB,CAAsBrI,QAAD,IAAc;EACnD;EACA;EACA8B,QAAAA,iBAAM,CAACC,cAAP,CAAuB,+BAAD,GACpBoG,iCAAc,CAAClE,KAAK,CAACS,OAAN,CAAcrD,GAAf,CADhB;EAEAS,QAAAA,iBAAM,CAACE,GAAP,CAAY,8BAA6BiG,YAAa,EAAtD;EAEAnG,QAAAA,iBAAM,CAACC,cAAP,CAAuB,4BAAvB;EACAD,QAAAA,iBAAM,CAACE,GAAP,CAAWiC,KAAK,CAACS,OAAjB;EACA5C,QAAAA,iBAAM,CAACG,QAAP;EAEAH,QAAAA,iBAAM,CAACC,cAAP,CAAuB,6BAAvB;EACAD,QAAAA,iBAAM,CAACE,GAAP,CAAWhC,QAAX;EACA8B,QAAAA,iBAAM,CAACG,QAAP;EAEAH,QAAAA,iBAAM,CAACG,QAAP;EACA,eAAOjC,QAAP;EACD,OAjBiB,CAAlB;EAkBD;;EAEDiE,IAAAA,KAAK,CAACwE,WAAN,CAAkBL,eAAlB;EACD,GAvDe,CAAhB;EAwDD,CAhEM;;ECzCP;;;;;;AAOA,EAIA,IAAIM,aAAa,GAAG,KAApB;EAEA;;;;;;;;;;;;;;;;;;;;;;;;;AAwBA,QAAaC,QAAQ,GAAIhB,OAAD,IAAa;EACnC,MAAI,CAACe,aAAL,EAAoB;EAClBX,IAAAA,gBAAgB,CAACJ,OAAD,CAAhB;EACAe,IAAAA,aAAa,GAAG,IAAhB;EACD;EACF,CALM;;ECtCP;;;;;;;AAQA,EAEA,MAAME,iBAAiB,GAAG,YAA1B;EAEA;;;;;;;;;;;;;;;;;;;EAkBA,MAAMC,oBAAoB,GAAG,OAC3BC,mBAD2B,EAE3BC,eAAe,GAAGH,iBAFS,KAEa;EACxC,QAAM3F,UAAU,GAAG,MAAMkB,MAAM,CAACG,IAAP,EAAzB;EAEA,QAAM0E
,kBAAkB,GAAG/F,UAAU,CAACgG,MAAX,CAAmBlG,SAAD,IAAe;EAC1D,WAAOA,SAAS,CAACmG,QAAV,CAAmBH,eAAnB,KACAhG,SAAS,CAACmG,QAAV,CAAmB7J,IAAI,CAAC8J,YAAL,CAAkBC,KAArC,CADA,IAEArG,SAAS,KAAK+F,mBAFrB;EAGD,GAJ0B,CAA3B;EAMA,QAAM1I,OAAO,CAAC0E,GAAR,CACFkE,kBAAkB,CAACvE,GAAnB,CAAwB1B,SAAD,IAAeoB,MAAM,CAACiB,MAAP,CAAcrC,SAAd,CAAtC,CADE,CAAN;EAGA,SAAOiG,kBAAP;EACD,CAfD;;EC9BA;;;;;;;AAQA,EAMA;;;;;;;AAMA,QAAaK,qBAAqB,GAAG,MAAM;EACzCrB,EAAAA,gBAAgB,CAAC,UAAD,EAAc/D,KAAD,IAAW;EACtC,UAAMlB,SAAS,GAAGE,yBAAU,CAACC,eAAX,EAAlB;EAEAe,IAAAA,KAAK,CAACqF,SAAN,CAAgBT,oBAAoB,CAAC9F,SAAD,CAApB,CAAgCsF,IAAhC,CAAsCkB,aAAD,IAAmB;EACtE,MAA2C;EACzC,YAAIA,aAAa,CAACnH,MAAd,GAAuB,CAA3B,EAA8B;EAC5BN,UAAAA,iBAAM,CAACE,GAAP,CAAY,sDAAD,GACN,gBADL,EACsBuH,aADtB;EAED;EACF;EACF,KAPe,CAAhB;EAQD,GAXe,CAAhB;EAYD,CAbM;;ECpBP;;;;;;;AAQA,EAKA;;;;;;;;;;;;;;;;;;;;AAmBA,QAAahD,mBAAiB,GAAIlF,GAAD,IAAS;EACxC,QAAMmF,kBAAkB,GAAGC,6BAA6B,EAAxD;EACA,SAAOD,kBAAkB,CAACD,iBAAnB,CAAqClF,GAArC,CAAP;EACD,CAHM;;EChCP;;;;;;;AAQA;EAMA,MAAMmI,eAAe,GAAIvF,KAAD,IAAW;EACjC,QAAMuC,kBAAkB,GAAGC,6BAA6B,EAAxD;EACA,QAAMjH,OAAO,GAAGC,eAAe,CAACC,GAAhB,EAAhB;EAEAuE,EAAAA,KAAK,CAACqF,SAAN,CACI9C,kBAAkB,CAACxC,OAAnB,CAA2B;EAACC,IAAAA,KAAD;EAAQzE,IAAAA;EAAR,GAA3B,EACKiK,KADL,CACYC,KAAD,IAAW;EAChB,IAA2C;EACzC5H,MAAAA,iBAAM,CAAC4H,KAAP,CAAc,8CAAD,GACZ,sDADD;EAED,KAJe;;;EAMhB,UAAMA,KAAN;EACD,GARL,CADJ;EAWD,CAfD;;EAiBA,MAAMC,gBAAgB,GAAI1F,KAAD,IAAW;EAClC,QAAMuC,kBAAkB,GAAGC,6BAA6B,EAAxD;EACA,QAAMjH,OAAO,GAAGC,eAAe,CAACC,GAAhB,EAAhB;EAEAuE,EAAAA,KAAK,CAACqF,SAAN,CAAgB9C,kBAAkB,CAACvB,QAAnB,CAA4B;EAAChB,IAAAA,KAAD;EAAQzE,IAAAA;EAAR,GAA5B,CAAhB;EACD,CALD;EAOA;;;;;;;;;;;;;;;;;;;;;AAmBA,QAAaoK,QAAQ,GAAItG,OAAD,IAAa;EACnC,QAAMkD,kBAAkB,GAAGC,6BAA6B,EAAxD;EACAD,EAAAA,kBAAkB,CAACnD,cAAnB,CAAkCC,OAAlC;;EAEA,MAAIA,OAAO,CAAClB,MAAR,GAAiB,CAArB,EAAwB;EACtB;EACA;EACA;EACA4F,IAAAA,gBAAgB,CAAC,SAAD,EAAYwB,eAAZ,CAAhB;EACAxB,IAAAA,gBAAgB,CAAC,UAAD,EAAa2B,gBAAb,CAAhB;EACD;EACF,CAXM;;ECzDP;;;;;;;AAQA,EAKA;;;;;;;;;;;;;;;AAcA,QAAaE,gBAAgB,GAAG,CAACvG,OAAD,EAAUqE,OAAV,KAAsB;EACpDi
C,EAAAA,QAAQ,CAACtG,OAAD,CAAR;EACAqF,EAAAA,QAAQ,CAAChB,OAAD,CAAR;EACD,CAHM;;EC3BP;;;;;;;AAQA;AAWA,EAA2C;EACzCpE,EAAAA,iBAAM,CAACuG,OAAP,CAAe,oBAAf;EACD;;;;;;;;;;;;;;;;"} -\ No newline at end of file -diff --git a/node_modules/workbox-precaching/build/workbox-precaching.prod.js b/node_modules/workbox-precaching/build/workbox-precaching.prod.js -index 6521788..339182e 100644 ---- a/node_modules/workbox-precaching/build/workbox-precaching.prod.js -+++ b/node_modules/workbox-precaching/build/workbox-precaching.prod.js -@@ -1,2 +1,2 @@ --this.workbox=this.workbox||{},this.workbox.precaching=function(t,e,n,s,c){"use strict";try{self["workbox:precaching:4.3.1"]&&_()}catch(t){}const o=[],i={get:()=>o,add(t){o.push(...t)}};const a="__WB_REVISION__";function r(t){if(!t)throw new c.WorkboxError("add-to-cache-list-unexpected-type",{entry:t});if("string"==typeof t){const e=new URL(t,location);return{cacheKey:e.href,url:e.href}}const{revision:e,url:n}=t;if(!n)throw new c.WorkboxError("add-to-cache-list-unexpected-type",{entry:t});if(!e){const t=new URL(n,location);return{cacheKey:t.href,url:t.href}}const s=new URL(n,location),o=new URL(n,location);return o.searchParams.set(a,e),{cacheKey:o.href,url:s.href}}class l{constructor(t){this.t=e.cacheNames.getPrecacheName(t),this.s=new Map}addToCacheList(t){for(const e of t){const{cacheKey:t,url:n}=r(e);if(this.s.has(n)&&this.s.get(n)!==t)throw new c.WorkboxError("add-to-cache-list-conflicting-entries",{firstEntry:this.s.get(n),secondEntry:t});this.s.set(n,t)}}async install({event:t,plugins:e}={}){const n=[],s=[],c=await caches.open(this.t),o=await c.keys(),i=new Set(o.map(t=>t.url));for(const t of this.s.values())i.has(t)?s.push(t):n.push(t);const a=n.map(n=>this.o({event:t,plugins:e,url:n}));return await Promise.all(a),{updatedURLs:n,notUpdatedURLs:s}}async activate(){const t=await caches.open(this.t),e=await t.keys(),n=new Set(this.s.values()),s=[];for(const c of e)n.has(c.url)||(await 
t.delete(c),s.push(c.url));return{deletedURLs:s}}async o({url:t,event:e,plugins:o}){const i=new Request(t,{credentials:"same-origin"});let a,r=await s.fetchWrapper.fetch({event:e,plugins:o,request:i});for(const t of o||[])"cacheWillUpdate"in t&&(a=t.cacheWillUpdate.bind(t));if(!(a?a({event:e,request:i,response:r}):r.status<400))throw new c.WorkboxError("bad-precaching-response",{url:t,status:r.status});r.redirected&&(r=await async function(t){const e=t.clone(),n="body"in e?Promise.resolve(e.body):e.blob(),s=await n;return new Response(s,{headers:e.headers,status:e.status,statusText:e.statusText})}(r)),await n.cacheWrapper.put({event:e,plugins:o,request:i,response:r,cacheName:this.t,matchOptions:{ignoreSearch:!0}})}getURLsToCacheKeys(){return this.s}getCachedURLs(){return[...this.s.keys()]}getCacheKeyForURL(t){const e=new URL(t,location);return this.s.get(e.href)}}let u;const h=()=>(u||(u=new l),u);const d=(t,e)=>{const n=h().getURLsToCacheKeys();for(const s of function*(t,{ignoreURLParametersMatching:e,directoryIndex:n,cleanURLs:s,urlManipulation:c}={}){const o=new URL(t,location);o.hash="",yield o.href;const i=function(t,e){for(const n of[...t.searchParams.keys()])e.some(t=>t.test(n))&&t.searchParams.delete(n);return t}(o,e);if(yield i.href,n&&i.pathname.endsWith("/")){const t=new URL(i);t.pathname+=n,yield t.href}if(s){const t=new URL(i);t.pathname+=".html",yield t.href}if(c){const t=c({url:o});for(const e of t)yield e.href}}(t,e)){const t=n.get(s);if(t)return t}};let w=!1;const f=t=>{w||((({ignoreURLParametersMatching:t=[/^utm_/],directoryIndex:n="index.html",cleanURLs:s=!0,urlManipulation:c=null}={})=>{const o=e.cacheNames.getPrecacheName();addEventListener("fetch",e=>{const i=d(e.request.url,{cleanURLs:s,directoryIndex:n,ignoreURLParametersMatching:t,urlManipulation:c});if(!i)return;let a=caches.open(o).then(t=>t.match(i)).then(t=>t||fetch(i));e.respondWith(a)})})(t),w=!0)},y=t=>{const e=h(),n=i.get();t.waitUntil(e.install({event:t,plugins:n}).catch(t=>{throw 
t}))},p=t=>{const e=h(),n=i.get();t.waitUntil(e.activate({event:t,plugins:n}))},L=t=>{h().addToCacheList(t),t.length>0&&(addEventListener("install",y),addEventListener("activate",p))};return t.addPlugins=(t=>{i.add(t)}),t.addRoute=f,t.cleanupOutdatedCaches=(()=>{addEventListener("activate",t=>{const n=e.cacheNames.getPrecacheName();t.waitUntil((async(t,e="-precache-")=>{const n=(await caches.keys()).filter(n=>n.includes(e)&&n.includes(self.registration.scope)&&n!==t);return await Promise.all(n.map(t=>caches.delete(t))),n})(n).then(t=>{}))})}),t.getCacheKeyForURL=(t=>{return h().getCacheKeyForURL(t)}),t.precache=L,t.precacheAndRoute=((t,e)=>{L(t),f(e)}),t.PrecacheController=l,t}({},workbox.core._private,workbox.core._private,workbox.core._private,workbox.core._private); -+this.workbox=this.workbox||{},this.workbox.precaching=function(t,e,n,s,c){"use strict";try{self["workbox:precaching:4.3.1"]&&_()}catch(t){}const o=[],i={get:()=>o,add(t){o.push(...t)}};const a="__WB_REVISION__";function r(t){if(!t)throw new c.WorkboxError("add-to-cache-list-unexpected-type",{entry:t});if("string"==typeof t){const e=new URL(t,location);return{cacheKey:e.href,url:e.href}}const{revision:e,url:n}=t;if(!n)throw new c.WorkboxError("add-to-cache-list-unexpected-type",{entry:t});if(!e){const t=new URL(n,location);return{cacheKey:t.href,url:t.href}}const s=new URL(n,location),o=new URL(n,location);return o.searchParams.set(a,e),{cacheKey:o.href,url:s.href}}class l{constructor(t){this.t=e.cacheNames.getPrecacheName(t),this.s=new Map}addToCacheList(t){for(const e of t){const{cacheKey:t,url:n}=r(e);if(this.s.has(n)&&this.s.get(n)!==t)throw new c.WorkboxError("add-to-cache-list-conflicting-entries",{firstEntry:this.s.get(n),secondEntry:t});this.s.set(n,t)}}async install({event:t,plugins:e}={}){const n=[],s=[],c=await caches.open(this.t),o=await c.keys(),i=new Set(o.map(t=>t.url));for(const t of this.s.values())i.has(t)?s.push(t):n.push(t);const 
a=n.map(n=>this.o({event:t,plugins:e,url:n}));return await Promise.all(a),{updatedURLs:n,notUpdatedURLs:s}}async activate(){const t=await caches.open(this.t),e=await t.keys(),n=new Set(this.s.values()),s=[];for(const c of e)n.has(c.url)||(await t.delete(c),s.push(c.url));return{deletedURLs:s}}async o({url:t,event:e,plugins:o}){const i=new Request(t,{credentials:"same-origin"});let a,r=await s.fetchWrapper.fetch({event:e,plugins:o,request:i,fetchOptions:{importance:"low"}});for(const t of o||[])"cacheWillUpdate"in t&&(a=t.cacheWillUpdate.bind(t));if(!(a?a({event:e,request:i,response:r}):r.status<400))throw new c.WorkboxError("bad-precaching-response",{url:t,status:r.status});r.redirected&&(r=await async function(t){const e=t.clone(),n="body"in e?Promise.resolve(e.body):e.blob(),s=await n;return new Response(s,{headers:e.headers,status:e.status,statusText:e.statusText})}(r)),await n.cacheWrapper.put({event:e,plugins:o,request:i,response:r,cacheName:this.t,matchOptions:{ignoreSearch:!0}})}getURLsToCacheKeys(){return this.s}getCachedURLs(){return[...this.s.keys()]}getCacheKeyForURL(t){const e=new URL(t,location);return this.s.get(e.href)}}let h;const u=()=>(h||(h=new l),h);const d=(t,e)=>{const n=u().getURLsToCacheKeys();for(const s of function*(t,{ignoreURLParametersMatching:e,directoryIndex:n,cleanURLs:s,urlManipulation:c}={}){const o=new URL(t,location);o.hash="",yield o.href;const i=function(t,e){for(const n of[...t.searchParams.keys()])e.some(t=>t.test(n))&&t.searchParams.delete(n);return t}(o,e);if(yield i.href,n&&i.pathname.endsWith("/")){const t=new URL(i);t.pathname+=n,yield t.href}if(s){const t=new URL(i);t.pathname+=".html",yield t.href}if(c){const t=c({url:o});for(const e of t)yield e.href}}(t,e)){const t=n.get(s);if(t)return t}};let w=!1;const f=t=>{w||((({ignoreURLParametersMatching:t=[/^utm_/],directoryIndex:n="index.html",cleanURLs:s=!0,urlManipulation:c=null}={})=>{const o=e.cacheNames.getPrecacheName();addEventListener("fetch",e=>{const 
i=d(e.request.url,{cleanURLs:s,directoryIndex:n,ignoreURLParametersMatching:t,urlManipulation:c});if(!i)return;let a=caches.open(o).then(t=>t.match(i)).then(t=>t||fetch(i));e.respondWith(a)})})(t),w=!0)},p=t=>{const e=u(),n=i.get();t.waitUntil(e.install({event:t,plugins:n}).catch(t=>{throw t}))},y=t=>{const e=u(),n=i.get();t.waitUntil(e.activate({event:t,plugins:n}))},L=t=>{u().addToCacheList(t),t.length>0&&(addEventListener("install",p),addEventListener("activate",y))};return t.PrecacheController=l,t.addPlugins=(t=>{i.add(t)}),t.addRoute=f,t.cleanupOutdatedCaches=(()=>{addEventListener("activate",t=>{const n=e.cacheNames.getPrecacheName();t.waitUntil((async(t,e="-precache-")=>{const n=(await caches.keys()).filter(n=>n.includes(e)&&n.includes(self.registration.scope)&&n!==t);return await Promise.all(n.map(t=>caches.delete(t))),n})(n).then(t=>{}))})}),t.getCacheKeyForURL=(t=>{return u().getCacheKeyForURL(t)}),t.precache=L,t.precacheAndRoute=((t,e)=>{L(t),f(e)}),t}({},workbox.core._private,workbox.core._private,workbox.core._private,workbox.core._private); - //# sourceMappingURL=workbox-precaching.prod.js.map -diff --git a/node_modules/workbox-precaching/build/workbox-precaching.prod.js.map b/node_modules/workbox-precaching/build/workbox-precaching.prod.js.map -index a67bd4a..3c977cf 100644 ---- a/node_modules/workbox-precaching/build/workbox-precaching.prod.js.map -+++ b/node_modules/workbox-precaching/build/workbox-precaching.prod.js.map -@@ -1 +1 @@ 
--{"version":3,"file":"workbox-precaching.prod.js","sources":["../_version.mjs","../utils/precachePlugins.mjs","../utils/createCacheKey.mjs","../PrecacheController.mjs","../utils/cleanRedirect.mjs","../utils/getOrCreatePrecacheController.mjs","../utils/getCacheKeyForURL.mjs","../utils/generateURLVariations.mjs","../utils/removeIgnoredSearchParams.mjs","../addRoute.mjs","../utils/addFetchListener.mjs","../precache.mjs","../addPlugins.mjs","../cleanupOutdatedCaches.mjs","../utils/deleteOutdatedCaches.mjs","../getCacheKeyForURL.mjs","../precacheAndRoute.mjs"],"sourcesContent":["try{self['workbox:precaching:4.3.1']&&_()}catch(e){}// eslint-disable-line","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport '../_version.mjs';\n\n\nconst plugins = [];\n\nexport const precachePlugins = {\n /*\n * @return {Array}\n * @private\n */\n get() {\n return plugins;\n },\n\n /*\n * @param {Array} newPlugins\n * @private\n */\n add(newPlugins) {\n plugins.push(...newPlugins);\n },\n};\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {WorkboxError} from 'workbox-core/_private/WorkboxError.mjs';\n\nimport '../_version.mjs';\n\n// Name of the search parameter used to store revision info.\nconst REVISION_SEARCH_PARAM = '__WB_REVISION__';\n\n/**\n * Converts a manifest entry into a versioned URL suitable for precaching.\n *\n * @param {Object} entry\n * @return {string} A URL with versioning info.\n *\n * @private\n * @memberof module:workbox-precaching\n */\nexport function createCacheKey(entry) {\n if (!entry) {\n throw new WorkboxError('add-to-cache-list-unexpected-type', {entry});\n }\n\n // If a precache manifest entry is a string, it's assumed to be a versioned\n // URL, like '/app.abcd1234.js'. 
Return as-is.\n if (typeof entry === 'string') {\n const urlObject = new URL(entry, location);\n return {\n cacheKey: urlObject.href,\n url: urlObject.href,\n };\n }\n\n const {revision, url} = entry;\n if (!url) {\n throw new WorkboxError('add-to-cache-list-unexpected-type', {entry});\n }\n\n // If there's just a URL and no revision, then it's also assumed to be a\n // versioned URL.\n if (!revision) {\n const urlObject = new URL(url, location);\n return {\n cacheKey: urlObject.href,\n url: urlObject.href,\n };\n }\n\n // Otherwise, construct a properly versioned URL using the custom Workbox\n // search parameter along with the revision info.\n const originalURL = new URL(url, location);\n const cacheKeyURL = new URL(url, location);\n cacheKeyURL.searchParams.set(REVISION_SEARCH_PARAM, revision);\n return {\n cacheKey: cacheKeyURL.href,\n url: originalURL.href,\n };\n}\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {assert} from 'workbox-core/_private/assert.mjs';\nimport {cacheNames} from 'workbox-core/_private/cacheNames.mjs';\nimport {cacheWrapper} from 'workbox-core/_private/cacheWrapper.mjs';\nimport {fetchWrapper} from 'workbox-core/_private/fetchWrapper.mjs';\nimport {WorkboxError} from 'workbox-core/_private/WorkboxError.mjs';\n\nimport {cleanRedirect} from './utils/cleanRedirect.mjs';\nimport {createCacheKey} from './utils/createCacheKey.mjs';\nimport {printCleanupDetails} from './utils/printCleanupDetails.mjs';\nimport {printInstallDetails} from './utils/printInstallDetails.mjs';\n\nimport './_version.mjs';\n\n\n/**\n * Performs efficient precaching of assets.\n *\n * @memberof module:workbox-precaching\n */\nclass PrecacheController {\n /**\n * Create a new PrecacheController.\n *\n * @param {string} [cacheName] An optional name for the cache, to override\n * the default precache name.\n */\n 
constructor(cacheName) {\n this._cacheName = cacheNames.getPrecacheName(cacheName);\n this._urlsToCacheKeys = new Map();\n }\n\n /**\n * This method will add items to the precache list, removing duplicates\n * and ensuring the information is valid.\n *\n * @param {\n * Array\n * } entries Array of entries to precache.\n */\n addToCacheList(entries) {\n if (process.env.NODE_ENV !== 'production') {\n assert.isArray(entries, {\n moduleName: 'workbox-precaching',\n className: 'PrecacheController',\n funcName: 'addToCacheList',\n paramName: 'entries',\n });\n }\n\n for (const entry of entries) {\n const {cacheKey, url} = createCacheKey(entry);\n if (this._urlsToCacheKeys.has(url) &&\n this._urlsToCacheKeys.get(url) !== cacheKey) {\n throw new WorkboxError('add-to-cache-list-conflicting-entries', {\n firstEntry: this._urlsToCacheKeys.get(url),\n secondEntry: cacheKey,\n });\n }\n this._urlsToCacheKeys.set(url, cacheKey);\n }\n }\n\n /**\n * Precaches new and updated assets. Call this method from the service worker\n * install event.\n *\n * @param {Object} options\n * @param {Event} [options.event] The install event (if needed).\n * @param {Array} [options.plugins] Plugins to be used for fetching\n * and caching during install.\n * @return {Promise}\n */\n async install({event, plugins} = {}) {\n if (process.env.NODE_ENV !== 'production') {\n if (plugins) {\n assert.isArray(plugins, {\n moduleName: 'workbox-precaching',\n className: 'PrecacheController',\n funcName: 'install',\n paramName: 'plugins',\n });\n }\n }\n\n const urlsToPrecache = [];\n const urlsAlreadyPrecached = [];\n\n const cache = await caches.open(this._cacheName);\n const alreadyCachedRequests = await cache.keys();\n const alreadyCachedURLs = new Set(alreadyCachedRequests.map(\n (request) => request.url));\n\n for (const cacheKey of this._urlsToCacheKeys.values()) {\n if (alreadyCachedURLs.has(cacheKey)) {\n urlsAlreadyPrecached.push(cacheKey);\n } else {\n urlsToPrecache.push(cacheKey);\n }\n }\n\n 
const precacheRequests = urlsToPrecache.map((url) => {\n return this._addURLToCache({event, plugins, url});\n });\n await Promise.all(precacheRequests);\n\n if (process.env.NODE_ENV !== 'production') {\n printInstallDetails(urlsToPrecache, urlsAlreadyPrecached);\n }\n\n return {\n updatedURLs: urlsToPrecache,\n notUpdatedURLs: urlsAlreadyPrecached,\n };\n }\n\n /**\n * Deletes assets that are no longer present in the current precache manifest.\n * Call this method from the service worker activate event.\n *\n * @return {Promise}\n */\n async activate() {\n const cache = await caches.open(this._cacheName);\n const currentlyCachedRequests = await cache.keys();\n const expectedCacheKeys = new Set(this._urlsToCacheKeys.values());\n\n const deletedURLs = [];\n for (const request of currentlyCachedRequests) {\n if (!expectedCacheKeys.has(request.url)) {\n await cache.delete(request);\n deletedURLs.push(request.url);\n }\n }\n\n if (process.env.NODE_ENV !== 'production') {\n printCleanupDetails(deletedURLs);\n }\n\n return {deletedURLs};\n }\n\n /**\n * Requests the entry and saves it to the cache if the response is valid.\n * By default, any response with a status code of less than 400 (including\n * opaque responses) is considered valid.\n *\n * If you need to use custom criteria to determine what's valid and what\n * isn't, then pass in an item in `options.plugins` that implements the\n * `cacheWillUpdate()` lifecycle event.\n *\n * @private\n * @param {Object} options\n * @param {string} options.url The URL to fetch and cache.\n * @param {Event} [options.event] The install event (if passed).\n * @param {Array} [options.plugins] An array of plugins to apply to\n * fetch and caching.\n */\n async _addURLToCache({url, event, plugins}) {\n const request = new Request(url, {credentials: 'same-origin'});\n let response = await fetchWrapper.fetch({\n event,\n plugins,\n request,\n });\n\n // Allow developers to override the default logic about what is and isn't\n // valid by 
passing in a plugin implementing cacheWillUpdate(), e.g.\n // a workbox.cacheableResponse.Plugin instance.\n let cacheWillUpdateCallback;\n for (const plugin of (plugins || [])) {\n if ('cacheWillUpdate' in plugin) {\n cacheWillUpdateCallback = plugin.cacheWillUpdate.bind(plugin);\n }\n }\n\n const isValidResponse = cacheWillUpdateCallback ?\n // Use a callback if provided. It returns a truthy value if valid.\n cacheWillUpdateCallback({event, request, response}) :\n // Otherwise, default to considering any response status under 400 valid.\n // This includes, by default, considering opaque responses valid.\n response.status < 400;\n\n // Consider this a failure, leading to the `install` handler failing, if\n // we get back an invalid response.\n if (!isValidResponse) {\n throw new WorkboxError('bad-precaching-response', {\n url,\n status: response.status,\n });\n }\n\n if (response.redirected) {\n response = await cleanRedirect(response);\n }\n\n await cacheWrapper.put({\n event,\n plugins,\n request,\n response,\n cacheName: this._cacheName,\n matchOptions: {\n ignoreSearch: true,\n },\n });\n }\n\n /**\n * Returns a mapping of a precached URL to the corresponding cache key, taking\n * into account the revision information for the URL.\n *\n * @return {Map} A URL to cache key mapping.\n */\n getURLsToCacheKeys() {\n return this._urlsToCacheKeys;\n }\n\n /**\n * Returns a list of all the URLs that have been precached by the current\n * service worker.\n *\n * @return {Array} The precached URLs.\n */\n getCachedURLs() {\n return [...this._urlsToCacheKeys.keys()];\n }\n\n /**\n * Returns the cache key used for storing a given URL. 
If that URL is\n * unversioned, like `/index.html', then the cache key will be the original\n * URL with a search parameter appended to it.\n *\n * @param {string} url A URL whose cache key you want to look up.\n * @return {string} The versioned URL that corresponds to a cache key\n * for the original URL, or undefined if that URL isn't precached.\n */\n getCacheKeyForURL(url) {\n const urlObject = new URL(url, location);\n return this._urlsToCacheKeys.get(urlObject.href);\n }\n}\n\nexport {PrecacheController};\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport '../_version.mjs';\n\n/**\n * @param {Response} response\n * @return {Response}\n *\n * @private\n * @memberof module:workbox-precaching\n */\nexport async function cleanRedirect(response) {\n const clonedResponse = response.clone();\n\n // Not all browsers support the Response.body stream, so fall back\n // to reading the entire body into memory as a blob.\n const bodyPromise = 'body' in clonedResponse ?\n Promise.resolve(clonedResponse.body) :\n clonedResponse.blob();\n\n const body = await bodyPromise;\n\n // new Response() is happy when passed either a stream or a Blob.\n return new Response(body, {\n headers: clonedResponse.headers,\n status: clonedResponse.status,\n statusText: clonedResponse.statusText,\n });\n}\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {PrecacheController} from '../PrecacheController.mjs';\nimport '../_version.mjs';\n\n\nlet precacheController;\n\n/**\n * @return {PrecacheController}\n * @private\n */\nexport const getOrCreatePrecacheController = () => {\n if (!precacheController) {\n precacheController = new PrecacheController();\n }\n return precacheController;\n};\n","/*\n Copyright 
2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {getOrCreatePrecacheController}\n from './getOrCreatePrecacheController.mjs';\nimport {generateURLVariations} from './generateURLVariations.mjs';\nimport '../_version.mjs';\n\n/**\n * This function will take the request URL and manipulate it based on the\n * configuration options.\n *\n * @param {string} url\n * @param {Object} options\n * @return {string} Returns the URL in the cache that matches the request,\n * if possible.\n *\n * @private\n */\nexport const getCacheKeyForURL = (url, options) => {\n const precacheController = getOrCreatePrecacheController();\n\n const urlsToCacheKeys = precacheController.getURLsToCacheKeys();\n for (const possibleURL of generateURLVariations(url, options)) {\n const possibleCacheKey = urlsToCacheKeys.get(possibleURL);\n if (possibleCacheKey) {\n return possibleCacheKey;\n }\n }\n};\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {removeIgnoredSearchParams} from './removeIgnoredSearchParams.mjs';\n\nimport '../_version.mjs';\n\n/**\n * Generator function that yields possible variations on the original URL to\n * check, one at a time.\n *\n * @param {string} url\n * @param {Object} options\n *\n * @private\n * @memberof module:workbox-precaching\n */\nexport function* generateURLVariations(url, {\n ignoreURLParametersMatching,\n directoryIndex,\n cleanURLs,\n urlManipulation,\n} = {}) {\n const urlObject = new URL(url, location);\n urlObject.hash = '';\n yield urlObject.href;\n\n const urlWithoutIgnoredParams = removeIgnoredSearchParams(\n urlObject, ignoreURLParametersMatching);\n yield urlWithoutIgnoredParams.href;\n\n if (directoryIndex && urlWithoutIgnoredParams.pathname.endsWith('/')) {\n const 
directoryURL = new URL(urlWithoutIgnoredParams);\n directoryURL.pathname += directoryIndex;\n yield directoryURL.href;\n }\n\n if (cleanURLs) {\n const cleanURL = new URL(urlWithoutIgnoredParams);\n cleanURL.pathname += '.html';\n yield cleanURL.href;\n }\n\n if (urlManipulation) {\n const additionalURLs = urlManipulation({url: urlObject});\n for (const urlToAttempt of additionalURLs) {\n yield urlToAttempt.href;\n }\n }\n}\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport '../_version.mjs';\n\n/**\n * Removes any URL search parameters that should be ignored.\n *\n * @param {URL} urlObject The original URL.\n * @param {Array} ignoreURLParametersMatching RegExps to test against\n * each search parameter name. Matches mean that the search parameter should be\n * ignored.\n * @return {URL} The URL with any ignored search parameters removed.\n *\n * @private\n * @memberof module:workbox-precaching\n */\nexport function removeIgnoredSearchParams(urlObject,\n ignoreURLParametersMatching) {\n // Convert the iterable into an array at the start of the loop to make sure\n // deletion doesn't mess up iteration.\n for (const paramName of [...urlObject.searchParams.keys()]) {\n if (ignoreURLParametersMatching.some((regExp) => regExp.test(paramName))) {\n urlObject.searchParams.delete(paramName);\n }\n }\n\n return urlObject;\n}\n","\n/*\n Copyright 2019 Google LLC\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {addFetchListener} from './utils/addFetchListener.mjs';\nimport './_version.mjs';\n\n\nlet listenerAdded = false;\n\n/**\n * Add a `fetch` listener to the service worker that will\n * respond to\n * [network requests]{@link 
https://developer.mozilla.org/en-US/docs/Web/API/Service_Worker_API/Using_Service_Workers#Custom_responses_to_requests}\n * with precached assets.\n *\n * Requests for assets that aren't precached, the `FetchEvent` will not be\n * responded to, allowing the event to fall through to other `fetch` event\n * listeners.\n *\n * @param {Object} options\n * @param {string} [options.directoryIndex=index.html] The `directoryIndex` will\n * check cache entries for a URLs ending with '/' to see if there is a hit when\n * appending the `directoryIndex` value.\n * @param {Array} [options.ignoreURLParametersMatching=[/^utm_/]] An\n * array of regex's to remove search params when looking for a cache match.\n * @param {boolean} [options.cleanURLs=true] The `cleanURLs` option will\n * check the cache for the URL with a `.html` added to the end of the end.\n * @param {workbox.precaching~urlManipulation} [options.urlManipulation]\n * This is a function that should take a URL and return an array of\n * alternative URL's that should be checked for precache matches.\n *\n * @alias workbox.precaching.addRoute\n */\nexport const addRoute = (options) => {\n if (!listenerAdded) {\n addFetchListener(options);\n listenerAdded = true;\n }\n};\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {cacheNames} from 'workbox-core/_private/cacheNames.mjs';\nimport {getFriendlyURL} from 'workbox-core/_private/getFriendlyURL.mjs';\nimport {logger} from 'workbox-core/_private/logger.mjs';\nimport {getCacheKeyForURL} from './getCacheKeyForURL.mjs';\nimport '../_version.mjs';\n\n\n/**\n * Adds a `fetch` listener to the service worker that will\n * respond to\n * [network requests]{@link https://developer.mozilla.org/en-US/docs/Web/API/Service_Worker_API/Using_Service_Workers#Custom_responses_to_requests}\n * with precached assets.\n *\n * Requests for assets 
that aren't precached, the `FetchEvent` will not be\n * responded to, allowing the event to fall through to other `fetch` event\n * listeners.\n *\n * NOTE: when called more than once this method will replace the previously set\n * configuration options. Calling it more than once is not recommended outside\n * of tests.\n *\n * @private\n * @param {Object} options\n * @param {string} [options.directoryIndex=index.html] The `directoryIndex` will\n * check cache entries for a URLs ending with '/' to see if there is a hit when\n * appending the `directoryIndex` value.\n * @param {Array} [options.ignoreURLParametersMatching=[/^utm_/]] An\n * array of regex's to remove search params when looking for a cache match.\n * @param {boolean} [options.cleanURLs=true] The `cleanURLs` option will\n * check the cache for the URL with a `.html` added to the end of the end.\n * @param {workbox.precaching~urlManipulation} [options.urlManipulation]\n * This is a function that should take a URL and return an array of\n * alternative URL's that should be checked for precache matches.\n */\nexport const addFetchListener = ({\n ignoreURLParametersMatching = [/^utm_/],\n directoryIndex = 'index.html',\n cleanURLs = true,\n urlManipulation = null,\n} = {}) => {\n const cacheName = cacheNames.getPrecacheName();\n\n addEventListener('fetch', (event) => {\n const precachedURL = getCacheKeyForURL(event.request.url, {\n cleanURLs,\n directoryIndex,\n ignoreURLParametersMatching,\n urlManipulation,\n });\n if (!precachedURL) {\n if (process.env.NODE_ENV !== 'production') {\n logger.debug(`Precaching did not find a match for ` +\n getFriendlyURL(event.request.url));\n }\n return;\n }\n\n let responsePromise = caches.open(cacheName).then((cache) => {\n return cache.match(precachedURL);\n }).then((cachedResponse) => {\n if (cachedResponse) {\n return cachedResponse;\n }\n\n // Fall back to the network if we don't have a cached response\n // (perhaps due to manual cache cleanup).\n if 
(process.env.NODE_ENV !== 'production') {\n logger.warn(`The precached response for ` +\n `${getFriendlyURL(precachedURL)} in ${cacheName} was not found. ` +\n `Falling back to the network instead.`);\n }\n\n return fetch(precachedURL);\n });\n\n if (process.env.NODE_ENV !== 'production') {\n responsePromise = responsePromise.then((response) => {\n // Workbox is going to handle the route.\n // print the routing details to the console.\n logger.groupCollapsed(`Precaching is responding to: ` +\n getFriendlyURL(event.request.url));\n logger.log(`Serving the precached url: ${precachedURL}`);\n\n logger.groupCollapsed(`View request details here.`);\n logger.log(event.request);\n logger.groupEnd();\n\n logger.groupCollapsed(`View response details here.`);\n logger.log(response);\n logger.groupEnd();\n\n logger.groupEnd();\n return response;\n });\n }\n\n event.respondWith(responsePromise);\n });\n};\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {logger} from 'workbox-core/_private/logger.mjs';\nimport {getOrCreatePrecacheController} from './utils/getOrCreatePrecacheController.mjs';\nimport {precachePlugins} from './utils/precachePlugins.mjs';\nimport './_version.mjs';\n\n\nconst installListener = (event) => {\n const precacheController = getOrCreatePrecacheController();\n const plugins = precachePlugins.get();\n\n event.waitUntil(\n precacheController.install({event, plugins})\n .catch((error) => {\n if (process.env.NODE_ENV !== 'production') {\n logger.error(`Service worker installation failed. 
It will ` +\n `be retried automatically during the next navigation.`);\n }\n // Re-throw the error to ensure installation fails.\n throw error;\n })\n );\n};\n\nconst activateListener = (event) => {\n const precacheController = getOrCreatePrecacheController();\n const plugins = precachePlugins.get();\n\n event.waitUntil(precacheController.activate({event, plugins}));\n};\n\n/**\n * Adds items to the precache list, removing any duplicates and\n * stores the files in the\n * [\"precache cache\"]{@link module:workbox-core.cacheNames} when the service\n * worker installs.\n *\n * This method can be called multiple times.\n *\n * Please note: This method **will not** serve any of the cached files for you.\n * It only precaches files. To respond to a network request you call\n * [addRoute()]{@link module:workbox-precaching.addRoute}.\n *\n * If you have a single array of files to precache, you can just call\n * [precacheAndRoute()]{@link module:workbox-precaching.precacheAndRoute}.\n *\n * @param {Array} entries Array of entries to precache.\n *\n * @alias workbox.precaching.precache\n */\nexport const precache = (entries) => {\n const precacheController = getOrCreatePrecacheController();\n precacheController.addToCacheList(entries);\n\n if (entries.length > 0) {\n // NOTE: these listeners will only be added once (even if the `precache()`\n // method is called multiple times) because event listeners are implemented\n // as a set, where each listener must be unique.\n addEventListener('install', installListener);\n addEventListener('activate', activateListener);\n }\n};\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {precachePlugins} from './utils/precachePlugins.mjs';\nimport './_version.mjs';\n\n\n/**\n * Adds plugins to precaching.\n *\n * @param {Array} newPlugins\n *\n * @alias workbox.precaching.addPlugins\n */\nconst 
addPlugins = (newPlugins) => {\n precachePlugins.add(newPlugins);\n};\n\nexport {addPlugins};\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {cacheNames} from 'workbox-core/_private/cacheNames.mjs';\nimport {logger} from 'workbox-core/_private/logger.mjs';\nimport {deleteOutdatedCaches} from './utils/deleteOutdatedCaches.mjs';\nimport './_version.mjs';\n\n\n/**\n * Adds an `activate` event listener which will clean up incompatible\n * precaches that were created by older versions of Workbox.\n *\n * @alias workbox.precaching.cleanupOutdatedCaches\n */\nexport const cleanupOutdatedCaches = () => {\n addEventListener('activate', (event) => {\n const cacheName = cacheNames.getPrecacheName();\n\n event.waitUntil(deleteOutdatedCaches(cacheName).then((cachesDeleted) => {\n if (process.env.NODE_ENV !== 'production') {\n if (cachesDeleted.length > 0) {\n logger.log(`The following out-of-date precaches were cleaned up ` +\n `automatically:`, cachesDeleted);\n }\n }\n }));\n });\n};\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport '../_version.mjs';\n\nconst SUBSTRING_TO_FIND = '-precache-';\n\n/**\n * Cleans up incompatible precaches that were created by older versions of\n * Workbox, by a service worker registered under the current scope.\n *\n * This is meant to be called as part of the `activate` event.\n *\n * This should be safe to use as long as you don't include `substringToFind`\n * (defaulting to `-precache-`) in your non-precache cache names.\n *\n * @param {string} currentPrecacheName The cache name currently in use for\n * precaching. 
This cache won't be deleted.\n * @param {string} [substringToFind='-precache-'] Cache names which include this\n * substring will be deleted (excluding `currentPrecacheName`).\n * @return {Array} A list of all the cache names that were deleted.\n *\n * @private\n * @memberof module:workbox-precaching\n */\nconst deleteOutdatedCaches = async (\n currentPrecacheName,\n substringToFind = SUBSTRING_TO_FIND) => {\n const cacheNames = await caches.keys();\n\n const cacheNamesToDelete = cacheNames.filter((cacheName) => {\n return cacheName.includes(substringToFind) &&\n cacheName.includes(self.registration.scope) &&\n cacheName !== currentPrecacheName;\n });\n\n await Promise.all(\n cacheNamesToDelete.map((cacheName) => caches.delete(cacheName)));\n\n return cacheNamesToDelete;\n};\n\nexport {deleteOutdatedCaches};\n\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {getOrCreatePrecacheController}\n from './utils/getOrCreatePrecacheController.mjs';\nimport './_version.mjs';\n\n\n/**\n * Takes in a URL, and returns the corresponding URL that could be used to\n * lookup the entry in the precache.\n *\n * If a relative URL is provided, the location of the service worker file will\n * be used as the base.\n *\n * For precached entries without revision information, the cache key will be the\n * same as the original URL.\n *\n * For precached entries with revision information, the cache key will be the\n * original URL with the addition of a query parameter used for keeping track of\n * the revision info.\n *\n * @param {string} url The URL whose cache key to look up.\n * @return {string} The cache key that corresponds to that URL.\n *\n * @alias workbox.precaching.getCacheKeyForURL\n */\nexport const getCacheKeyForURL = (url) => {\n const precacheController = getOrCreatePrecacheController();\n return 
precacheController.getCacheKeyForURL(url);\n};\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {addRoute} from './addRoute.mjs';\nimport {precache} from './precache.mjs';\nimport './_version.mjs';\n\n\n/**\n * This method will add entries to the precache list and add a route to\n * respond to fetch events.\n *\n * This is a convenience method that will call\n * [precache()]{@link module:workbox-precaching.precache} and\n * [addRoute()]{@link module:workbox-precaching.addRoute} in a single call.\n *\n * @param {Array} entries Array of entries to precache.\n * @param {Object} options See\n * [addRoute() options]{@link module:workbox-precaching.addRoute}.\n *\n * @alias workbox.precaching.precacheAndRoute\n */\nexport const precacheAndRoute = (entries, options) => {\n precache(entries);\n addRoute(options);\n};\n"],"names":["self","_","e","plugins","precachePlugins","get","add","newPlugins","push","REVISION_SEARCH_PARAM","createCacheKey","entry","WorkboxError","urlObject","URL","location","cacheKey","href","url","revision","originalURL","cacheKeyURL","searchParams","set","PrecacheController","constructor","cacheName","_cacheName","cacheNames","getPrecacheName","_urlsToCacheKeys","Map","addToCacheList","entries","this","has","firstEntry","secondEntry","event","urlsToPrecache","urlsAlreadyPrecached","cache","caches","open","alreadyCachedRequests","keys","alreadyCachedURLs","Set","map","request","values","precacheRequests","_addURLToCache","Promise","all","updatedURLs","notUpdatedURLs","currentlyCachedRequests","expectedCacheKeys","deletedURLs","delete","Request","credentials","cacheWillUpdateCallback","response","fetchWrapper","fetch","plugin","cacheWillUpdate","bind","status","redirected","async","clonedResponse","clone","bodyPromise","resolve","body","blob","Response","headers","statusText","cleanRedirect","cacheWrapper",
"put","matchOptions","ignoreSearch","getURLsToCacheKeys","getCachedURLs","getCacheKeyForURL","precacheController","getOrCreatePrecacheController","options","urlsToCacheKeys","possibleURL","ignoreURLParametersMatching","directoryIndex","cleanURLs","urlManipulation","hash","urlWithoutIgnoredParams","paramName","some","regExp","test","removeIgnoredSearchParams","pathname","endsWith","directoryURL","cleanURL","additionalURLs","urlToAttempt","generateURLVariations","possibleCacheKey","listenerAdded","addRoute","addEventListener","precachedURL","responsePromise","then","match","cachedResponse","respondWith","addFetchListener","installListener","waitUntil","install","catch","error","activateListener","activate","precache","length","currentPrecacheName","substringToFind","cacheNamesToDelete","filter","includes","registration","scope","deleteOutdatedCaches","cachesDeleted"],"mappings":"uFAAA,IAAIA,KAAK,6BAA6BC,IAAI,MAAMC,ICWhD,MAAMC,EAAU,GAEHC,EAAkB,CAK7BC,IAAG,IACMF,EAOTG,IAAIC,GACFJ,EAAQK,QAAQD,KCdpB,MAAME,EAAwB,kBAWvB,SAASC,EAAeC,OACxBA,QACG,IAAIC,eAAa,oCAAqC,CAACD,MAAAA,OAK1C,iBAAVA,EAAoB,OACvBE,EAAY,IAAIC,IAAIH,EAAOI,gBAC1B,CACLC,SAAUH,EAAUI,KACpBC,IAAKL,EAAUI,YAIbE,SAACA,EAADD,IAAWA,GAAOP,MACnBO,QACG,IAAIN,eAAa,oCAAqC,CAACD,MAAAA,QAK1DQ,EAAU,OACPN,EAAY,IAAIC,IAAII,EAAKH,gBACxB,CACLC,SAAUH,EAAUI,KACpBC,IAAKL,EAAUI,YAMbG,EAAc,IAAIN,IAAII,EAAKH,UAC3BM,EAAc,IAAIP,IAAII,EAAKH,iBACjCM,EAAYC,aAAaC,IAAId,EAAuBU,GAC7C,CACLH,SAAUK,EAAYJ,KACtBC,IAAKE,EAAYH,MClCrB,MAAMO,EAOJC,YAAYC,QACLC,EAAaC,aAAWC,gBAAgBH,QACxCI,EAAmB,IAAIC,IAW9BC,eAAeC,OAUR,MAAMtB,KAASsB,EAAS,OACrBjB,SAACA,EAADE,IAAWA,GAAOR,EAAeC,MACnCuB,KAAKJ,EAAiBK,IAAIjB,IAC1BgB,KAAKJ,EAAiBzB,IAAIa,KAASF,QAC/B,IAAIJ,eAAa,wCAAyC,CAC9DwB,WAAYF,KAAKJ,EAAiBzB,IAAIa,GACtCmB,YAAarB,SAGZc,EAAiBP,IAAIL,EAAKF,mBAcrBsB,MAACA,EAADnC,QAAQA,GAAW,UAYzBoC,EAAiB,GACjBC,EAAuB,GAEvBC,QAAcC,OAAOC,KAAKT,KAAKP,GAC/BiB,QAA8BH,EAAMI,OACpCC,EAAoB,IAAIC,IAAIH,EAAsBI,IACnDC,GAAYA,EAAQ/B,UAEpB,MAAMF,KAAYkB,KAAKJ,EAAiBoB,SACvCJ,EAAkBX,IAAInB,GACxBwB,EA
AqBhC,KAAKQ,GAE1BuB,EAAe/B,KAAKQ,SAIlBmC,EAAmBZ,EAAeS,IAAK9B,GACpCgB,KAAKkB,EAAe,CAACd,MAAAA,EAAOnC,QAAAA,EAASe,IAAAA,kBAExCmC,QAAQC,IAAIH,GAMX,CACLI,YAAahB,EACbiB,eAAgBhB,0BAWZC,QAAcC,OAAOC,KAAKT,KAAKP,GAC/B8B,QAAgChB,EAAMI,OACtCa,EAAoB,IAAIX,IAAIb,KAAKJ,EAAiBoB,UAElDS,EAAc,OACf,MAAMV,KAAWQ,EACfC,EAAkBvB,IAAIc,EAAQ/B,aAC3BuB,EAAMmB,OAAOX,GACnBU,EAAYnD,KAAKyC,EAAQ/B,YAQtB,CAACyC,YAAAA,YAmBWzC,IAACA,EAADoB,MAAMA,EAANnC,QAAaA,UAC1B8C,EAAU,IAAIY,QAAQ3C,EAAK,CAAC4C,YAAa,oBAU3CC,EATAC,QAAiBC,eAAaC,MAAM,CACtC5B,MAAAA,EACAnC,QAAAA,EACA8C,QAAAA,QAOG,MAAMkB,KAAWhE,GAAW,GAC3B,oBAAqBgE,IACvBJ,EAA0BI,EAAOC,gBAAgBC,KAAKF,SAIlCJ,EAEtBA,EAAwB,CAACzB,MAAAA,EAAOW,QAAAA,EAASe,SAAAA,IAGzCA,EAASM,OAAS,WAKZ,IAAI1D,eAAa,0BAA2B,CAChDM,IAAAA,EACAoD,OAAQN,EAASM,SAIjBN,EAASO,aACXP,QCvLCQ,eAA6BR,SAC5BS,EAAiBT,EAASU,QAI1BC,EAAc,SAAUF,EAC5BpB,QAAQuB,QAAQH,EAAeI,MAC/BJ,EAAeK,OAEXD,QAAaF,SAGZ,IAAII,SAASF,EAAM,CACxBG,QAASP,EAAeO,QACxBV,OAAQG,EAAeH,OACvBW,WAAYR,EAAeQ,aDwKRC,CAAclB,UAG3BmB,eAAaC,IAAI,CACrB9C,MAAAA,EACAnC,QAAAA,EACA8C,QAAAA,EACAe,SAAAA,EACAtC,UAAWQ,KAAKP,EAChB0D,aAAc,CACZC,cAAc,KAWpBC,4BACSrD,KAAKJ,EASd0D,sBACS,IAAItD,KAAKJ,EAAiBe,QAYnC4C,kBAAkBvE,SACVL,EAAY,IAAIC,IAAII,EAAKH,iBACxBmB,KAAKJ,EAAiBzB,IAAIQ,EAAUI,OE1O/C,IAAIyE,EAMG,MAAMC,EAAgC,KACtCD,IACHA,EAAqB,IAAIlE,GAEpBkE,GCEF,MAAMD,EAAoB,CAACvE,EAAK0E,WAG/BC,EAFqBF,IAEgBJ,yBACtC,MAAMO,KCNN,UAAgC5E,GAAK6E,4BAC1CA,EAD0CC,eAE1CA,EAF0CC,UAG1CA,EAH0CC,gBAI1CA,GACE,UACIrF,EAAY,IAAIC,IAAII,EAAKH,UAC/BF,EAAUsF,KAAO,SACXtF,EAAUI,WAEVmF,ECVD,SAAmCvF,EACtCkF,OAGG,MAAMM,IAAa,IAAIxF,EAAUS,aAAauB,QAC7CkD,EAA4BO,KAAMC,GAAWA,EAAOC,KAAKH,KAC3DxF,EAAUS,aAAasC,OAAOyC,UAI3BxF,EDAyB4F,CAC5B5F,EAAWkF,YACTK,EAAwBnF,KAE1B+E,GAAkBI,EAAwBM,SAASC,SAAS,KAAM,OAC9DC,EAAe,IAAI9F,IAAIsF,GAC7BQ,EAAaF,UAAYV,QACnBY,EAAa3F,QAGjBgF,EAAW,OACPY,EAAW,IAAI/F,IAAIsF,GACzBS,EAASH,UAAY,cACfG,EAAS5F,QAGbiF,EAAiB,OACbY,EAAiBZ,EAAgB,CAAChF,IAAKL,QACxC,MAAMkG,KAAgBD,QACnBC,EAAa9F,MDvBG+F,CAAsB9F,EAAK0E,GAAU,OACvDqB,EAAmBpB,EAAgBxF,IAAIyF,MACzCmB,SACKA,IGnBb,IAAIC,GAAgB,QA0BPC,E
AAYvB,IAClBsB,ICGyB,GAC9BnB,4BAAAA,EAA8B,CAAC,SAC/BC,eAAAA,EAAiB,aACjBC,UAAAA,GAAY,EACZC,gBAAAA,EAAkB,MAChB,YACIxE,EAAYE,aAAWC,kBAE7BuF,iBAAiB,QAAU9E,UACnB+E,EAAe5B,EAAkBnD,EAAMW,QAAQ/B,IAAK,CACxD+E,UAAAA,EACAD,eAAAA,EACAD,4BAAAA,EACAG,gBAAAA,QAEGmB,aAQDC,EAAkB5E,OAAOC,KAAKjB,GAAW6F,KAAM9E,GAC1CA,EAAM+E,MAAMH,IAClBE,KAAME,GACHA,GAYGvD,MAAMmD,IAwBf/E,EAAMoF,YAAYJ,MDhElBK,CAAiB/B,GACjBsB,GAAgB,IE3BdU,EAAmBtF,UACjBoD,EAAqBC,IACrBxF,EAAUC,EAAgBC,MAEhCiC,EAAMuF,UACFnC,EAAmBoC,QAAQ,CAACxF,MAAAA,EAAOnC,QAAAA,IAC9B4H,MAAOC,UAMAA,MAKZC,EAAoB3F,UAClBoD,EAAqBC,IACrBxF,EAAUC,EAAgBC,MAEhCiC,EAAMuF,UAAUnC,EAAmBwC,SAAS,CAAC5F,MAAAA,EAAOnC,QAAAA,MAsBzCgI,EAAYlG,IACI0D,IACR3D,eAAeC,GAE9BA,EAAQmG,OAAS,IAInBhB,iBAAiB,UAAWQ,GAC5BR,iBAAiB,WAAYa,yBC/Cb1H,CAAAA,IAClBH,EAAgBE,IAAIC,0CCAe,MACnC6G,iBAAiB,WAAa9E,UACtBZ,EAAYE,aAAWC,kBAE7BS,EAAMuF,UCMmBrD,OAC3B6D,EACAC,EAtBwB,sBAyBlBC,SAFmB7F,OAAOG,QAEM2F,OAAQ9G,GACrCA,EAAU+G,SAASH,IACnB5G,EAAU+G,SAASzI,KAAK0I,aAAaC,QACrCjH,IAAc2G,gBAGjBhF,QAAQC,IACViF,EAAmBvF,IAAKtB,GAAcgB,OAAOkB,OAAOlC,KAEjD6G,GDpBWK,CAAqBlH,GAAW6F,KAAMsB,gCEQxB3H,CAAAA,WACLyE,IACDF,kBAAkBvE,qCCPd,EAACe,EAAS2D,KACxCuC,EAASlG,GACTkF,EAASvB"} -\ No newline at end of file -+{"version":3,"file":"workbox-precaching.prod.js","sources":["../_version.mjs","../utils/precachePlugins.mjs","../utils/createCacheKey.mjs","../PrecacheController.mjs","../utils/cleanRedirect.mjs","../utils/getOrCreatePrecacheController.mjs","../utils/getCacheKeyForURL.mjs","../utils/generateURLVariations.mjs","../utils/removeIgnoredSearchParams.mjs","../addRoute.mjs","../utils/addFetchListener.mjs","../precache.mjs","../addPlugins.mjs","../cleanupOutdatedCaches.mjs","../utils/deleteOutdatedCaches.mjs","../getCacheKeyForURL.mjs","../precacheAndRoute.mjs"],"sourcesContent":["try{self['workbox:precaching:4.3.1']&&_()}catch(e){}// eslint-disable-line","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n 
https://opensource.org/licenses/MIT.\n*/\n\nimport '../_version.mjs';\n\n\nconst plugins = [];\n\nexport const precachePlugins = {\n /*\n * @return {Array}\n * @private\n */\n get() {\n return plugins;\n },\n\n /*\n * @param {Array} newPlugins\n * @private\n */\n add(newPlugins) {\n plugins.push(...newPlugins);\n },\n};\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {WorkboxError} from 'workbox-core/_private/WorkboxError.mjs';\n\nimport '../_version.mjs';\n\n// Name of the search parameter used to store revision info.\nconst REVISION_SEARCH_PARAM = '__WB_REVISION__';\n\n/**\n * Converts a manifest entry into a versioned URL suitable for precaching.\n *\n * @param {Object} entry\n * @return {string} A URL with versioning info.\n *\n * @private\n * @memberof module:workbox-precaching\n */\nexport function createCacheKey(entry) {\n if (!entry) {\n throw new WorkboxError('add-to-cache-list-unexpected-type', {entry});\n }\n\n // If a precache manifest entry is a string, it's assumed to be a versioned\n // URL, like '/app.abcd1234.js'. 
Return as-is.\n if (typeof entry === 'string') {\n const urlObject = new URL(entry, location);\n return {\n cacheKey: urlObject.href,\n url: urlObject.href,\n };\n }\n\n const {revision, url} = entry;\n if (!url) {\n throw new WorkboxError('add-to-cache-list-unexpected-type', {entry});\n }\n\n // If there's just a URL and no revision, then it's also assumed to be a\n // versioned URL.\n if (!revision) {\n const urlObject = new URL(url, location);\n return {\n cacheKey: urlObject.href,\n url: urlObject.href,\n };\n }\n\n // Otherwise, construct a properly versioned URL using the custom Workbox\n // search parameter along with the revision info.\n const originalURL = new URL(url, location);\n const cacheKeyURL = new URL(url, location);\n cacheKeyURL.searchParams.set(REVISION_SEARCH_PARAM, revision);\n return {\n cacheKey: cacheKeyURL.href,\n url: originalURL.href,\n };\n}\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {assert} from 'workbox-core/_private/assert.mjs';\nimport {cacheNames} from 'workbox-core/_private/cacheNames.mjs';\nimport {cacheWrapper} from 'workbox-core/_private/cacheWrapper.mjs';\nimport {fetchWrapper} from 'workbox-core/_private/fetchWrapper.mjs';\nimport {WorkboxError} from 'workbox-core/_private/WorkboxError.mjs';\n\nimport {cleanRedirect} from './utils/cleanRedirect.mjs';\nimport {createCacheKey} from './utils/createCacheKey.mjs';\nimport {printCleanupDetails} from './utils/printCleanupDetails.mjs';\nimport {printInstallDetails} from './utils/printInstallDetails.mjs';\n\nimport './_version.mjs';\n\n\n/**\n * Performs efficient precaching of assets.\n *\n * @memberof module:workbox-precaching\n */\nclass PrecacheController {\n /**\n * Create a new PrecacheController.\n *\n * @param {string} [cacheName] An optional name for the cache, to override\n * the default precache name.\n */\n 
constructor(cacheName) {\n this._cacheName = cacheNames.getPrecacheName(cacheName);\n this._urlsToCacheKeys = new Map();\n }\n\n /**\n * This method will add items to the precache list, removing duplicates\n * and ensuring the information is valid.\n *\n * @param {\n * Array\n * } entries Array of entries to precache.\n */\n addToCacheList(entries) {\n if (process.env.NODE_ENV !== 'production') {\n assert.isArray(entries, {\n moduleName: 'workbox-precaching',\n className: 'PrecacheController',\n funcName: 'addToCacheList',\n paramName: 'entries',\n });\n }\n\n for (const entry of entries) {\n const {cacheKey, url} = createCacheKey(entry);\n if (this._urlsToCacheKeys.has(url) &&\n this._urlsToCacheKeys.get(url) !== cacheKey) {\n throw new WorkboxError('add-to-cache-list-conflicting-entries', {\n firstEntry: this._urlsToCacheKeys.get(url),\n secondEntry: cacheKey,\n });\n }\n this._urlsToCacheKeys.set(url, cacheKey);\n }\n }\n\n /**\n * Precaches new and updated assets. Call this method from the service worker\n * install event.\n *\n * @param {Object} options\n * @param {Event} [options.event] The install event (if needed).\n * @param {Array} [options.plugins] Plugins to be used for fetching\n * and caching during install.\n * @return {Promise}\n */\n async install({event, plugins} = {}) {\n if (process.env.NODE_ENV !== 'production') {\n if (plugins) {\n assert.isArray(plugins, {\n moduleName: 'workbox-precaching',\n className: 'PrecacheController',\n funcName: 'install',\n paramName: 'plugins',\n });\n }\n }\n\n const urlsToPrecache = [];\n const urlsAlreadyPrecached = [];\n\n const cache = await caches.open(this._cacheName);\n const alreadyCachedRequests = await cache.keys();\n const alreadyCachedURLs = new Set(alreadyCachedRequests.map(\n (request) => request.url));\n\n for (const cacheKey of this._urlsToCacheKeys.values()) {\n if (alreadyCachedURLs.has(cacheKey)) {\n urlsAlreadyPrecached.push(cacheKey);\n } else {\n urlsToPrecache.push(cacheKey);\n }\n }\n\n 
const precacheRequests = urlsToPrecache.map((url) => {\n return this._addURLToCache({event, plugins, url});\n });\n await Promise.all(precacheRequests);\n\n if (process.env.NODE_ENV !== 'production') {\n printInstallDetails(urlsToPrecache, urlsAlreadyPrecached);\n }\n\n return {\n updatedURLs: urlsToPrecache,\n notUpdatedURLs: urlsAlreadyPrecached,\n };\n }\n\n /**\n * Deletes assets that are no longer present in the current precache manifest.\n * Call this method from the service worker activate event.\n *\n * @return {Promise}\n */\n async activate() {\n const cache = await caches.open(this._cacheName);\n const currentlyCachedRequests = await cache.keys();\n const expectedCacheKeys = new Set(this._urlsToCacheKeys.values());\n\n const deletedURLs = [];\n for (const request of currentlyCachedRequests) {\n if (!expectedCacheKeys.has(request.url)) {\n await cache.delete(request);\n deletedURLs.push(request.url);\n }\n }\n\n if (process.env.NODE_ENV !== 'production') {\n printCleanupDetails(deletedURLs);\n }\n\n return {deletedURLs};\n }\n\n /**\n * Requests the entry and saves it to the cache if the response is valid.\n * By default, any response with a status code of less than 400 (including\n * opaque responses) is considered valid.\n *\n * If you need to use custom criteria to determine what's valid and what\n * isn't, then pass in an item in `options.plugins` that implements the\n * `cacheWillUpdate()` lifecycle event.\n *\n * @private\n * @param {Object} options\n * @param {string} options.url The URL to fetch and cache.\n * @param {Event} [options.event] The install event (if passed).\n * @param {Array} [options.plugins] An array of plugins to apply to\n * fetch and caching.\n */\n async _addURLToCache({url, event, plugins}) {\n const request = new Request(url, {credentials: 'same-origin'});\n let response = await fetchWrapper.fetch({\n event,\n plugins,\n request,\n fetchOptions: { importance: 'low'},\n });\n\n // Allow developers to override the default logic 
about what is and isn't\n // valid by passing in a plugin implementing cacheWillUpdate(), e.g.\n // a workbox.cacheableResponse.Plugin instance.\n let cacheWillUpdateCallback;\n for (const plugin of (plugins || [])) {\n if ('cacheWillUpdate' in plugin) {\n cacheWillUpdateCallback = plugin.cacheWillUpdate.bind(plugin);\n }\n }\n\n const isValidResponse = cacheWillUpdateCallback ?\n // Use a callback if provided. It returns a truthy value if valid.\n cacheWillUpdateCallback({event, request, response}) :\n // Otherwise, default to considering any response status under 400 valid.\n // This includes, by default, considering opaque responses valid.\n response.status < 400;\n\n // Consider this a failure, leading to the `install` handler failing, if\n // we get back an invalid response.\n if (!isValidResponse) {\n throw new WorkboxError('bad-precaching-response', {\n url,\n status: response.status,\n });\n }\n\n if (response.redirected) {\n response = await cleanRedirect(response);\n }\n\n await cacheWrapper.put({\n event,\n plugins,\n request,\n response,\n cacheName: this._cacheName,\n matchOptions: {\n ignoreSearch: true,\n },\n });\n }\n\n /**\n * Returns a mapping of a precached URL to the corresponding cache key, taking\n * into account the revision information for the URL.\n *\n * @return {Map} A URL to cache key mapping.\n */\n getURLsToCacheKeys() {\n return this._urlsToCacheKeys;\n }\n\n /**\n * Returns a list of all the URLs that have been precached by the current\n * service worker.\n *\n * @return {Array} The precached URLs.\n */\n getCachedURLs() {\n return [...this._urlsToCacheKeys.keys()];\n }\n\n /**\n * Returns the cache key used for storing a given URL. 
If that URL is\n * unversioned, like `/index.html', then the cache key will be the original\n * URL with a search parameter appended to it.\n *\n * @param {string} url A URL whose cache key you want to look up.\n * @return {string} The versioned URL that corresponds to a cache key\n * for the original URL, or undefined if that URL isn't precached.\n */\n getCacheKeyForURL(url) {\n const urlObject = new URL(url, location);\n return this._urlsToCacheKeys.get(urlObject.href);\n }\n}\n\nexport {PrecacheController};\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport '../_version.mjs';\n\n/**\n * @param {Response} response\n * @return {Response}\n *\n * @private\n * @memberof module:workbox-precaching\n */\nexport async function cleanRedirect(response) {\n const clonedResponse = response.clone();\n\n // Not all browsers support the Response.body stream, so fall back\n // to reading the entire body into memory as a blob.\n const bodyPromise = 'body' in clonedResponse ?\n Promise.resolve(clonedResponse.body) :\n clonedResponse.blob();\n\n const body = await bodyPromise;\n\n // new Response() is happy when passed either a stream or a Blob.\n return new Response(body, {\n headers: clonedResponse.headers,\n status: clonedResponse.status,\n statusText: clonedResponse.statusText,\n });\n}\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {PrecacheController} from '../PrecacheController.mjs';\nimport '../_version.mjs';\n\n\nlet precacheController;\n\n/**\n * @return {PrecacheController}\n * @private\n */\nexport const getOrCreatePrecacheController = () => {\n if (!precacheController) {\n precacheController = new PrecacheController();\n }\n return precacheController;\n};\n","/*\n Copyright 
2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {getOrCreatePrecacheController}\n from './getOrCreatePrecacheController.mjs';\nimport {generateURLVariations} from './generateURLVariations.mjs';\nimport '../_version.mjs';\n\n/**\n * This function will take the request URL and manipulate it based on the\n * configuration options.\n *\n * @param {string} url\n * @param {Object} options\n * @return {string} Returns the URL in the cache that matches the request,\n * if possible.\n *\n * @private\n */\nexport const getCacheKeyForURL = (url, options) => {\n const precacheController = getOrCreatePrecacheController();\n\n const urlsToCacheKeys = precacheController.getURLsToCacheKeys();\n for (const possibleURL of generateURLVariations(url, options)) {\n const possibleCacheKey = urlsToCacheKeys.get(possibleURL);\n if (possibleCacheKey) {\n return possibleCacheKey;\n }\n }\n};\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {removeIgnoredSearchParams} from './removeIgnoredSearchParams.mjs';\n\nimport '../_version.mjs';\n\n/**\n * Generator function that yields possible variations on the original URL to\n * check, one at a time.\n *\n * @param {string} url\n * @param {Object} options\n *\n * @private\n * @memberof module:workbox-precaching\n */\nexport function* generateURLVariations(url, {\n ignoreURLParametersMatching,\n directoryIndex,\n cleanURLs,\n urlManipulation,\n} = {}) {\n const urlObject = new URL(url, location);\n urlObject.hash = '';\n yield urlObject.href;\n\n const urlWithoutIgnoredParams = removeIgnoredSearchParams(\n urlObject, ignoreURLParametersMatching);\n yield urlWithoutIgnoredParams.href;\n\n if (directoryIndex && urlWithoutIgnoredParams.pathname.endsWith('/')) {\n const 
directoryURL = new URL(urlWithoutIgnoredParams);\n directoryURL.pathname += directoryIndex;\n yield directoryURL.href;\n }\n\n if (cleanURLs) {\n const cleanURL = new URL(urlWithoutIgnoredParams);\n cleanURL.pathname += '.html';\n yield cleanURL.href;\n }\n\n if (urlManipulation) {\n const additionalURLs = urlManipulation({url: urlObject});\n for (const urlToAttempt of additionalURLs) {\n yield urlToAttempt.href;\n }\n }\n}\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport '../_version.mjs';\n\n/**\n * Removes any URL search parameters that should be ignored.\n *\n * @param {URL} urlObject The original URL.\n * @param {Array} ignoreURLParametersMatching RegExps to test against\n * each search parameter name. Matches mean that the search parameter should be\n * ignored.\n * @return {URL} The URL with any ignored search parameters removed.\n *\n * @private\n * @memberof module:workbox-precaching\n */\nexport function removeIgnoredSearchParams(urlObject,\n ignoreURLParametersMatching) {\n // Convert the iterable into an array at the start of the loop to make sure\n // deletion doesn't mess up iteration.\n for (const paramName of [...urlObject.searchParams.keys()]) {\n if (ignoreURLParametersMatching.some((regExp) => regExp.test(paramName))) {\n urlObject.searchParams.delete(paramName);\n }\n }\n\n return urlObject;\n}\n","\n/*\n Copyright 2019 Google LLC\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {addFetchListener} from './utils/addFetchListener.mjs';\nimport './_version.mjs';\n\n\nlet listenerAdded = false;\n\n/**\n * Add a `fetch` listener to the service worker that will\n * respond to\n * [network requests]{@link 
https://developer.mozilla.org/en-US/docs/Web/API/Service_Worker_API/Using_Service_Workers#Custom_responses_to_requests}\n * with precached assets.\n *\n * Requests for assets that aren't precached, the `FetchEvent` will not be\n * responded to, allowing the event to fall through to other `fetch` event\n * listeners.\n *\n * @param {Object} options\n * @param {string} [options.directoryIndex=index.html] The `directoryIndex` will\n * check cache entries for a URLs ending with '/' to see if there is a hit when\n * appending the `directoryIndex` value.\n * @param {Array} [options.ignoreURLParametersMatching=[/^utm_/]] An\n * array of regex's to remove search params when looking for a cache match.\n * @param {boolean} [options.cleanURLs=true] The `cleanURLs` option will\n * check the cache for the URL with a `.html` added to the end of the end.\n * @param {workbox.precaching~urlManipulation} [options.urlManipulation]\n * This is a function that should take a URL and return an array of\n * alternative URL's that should be checked for precache matches.\n *\n * @alias workbox.precaching.addRoute\n */\nexport const addRoute = (options) => {\n if (!listenerAdded) {\n addFetchListener(options);\n listenerAdded = true;\n }\n};\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {cacheNames} from 'workbox-core/_private/cacheNames.mjs';\nimport {getFriendlyURL} from 'workbox-core/_private/getFriendlyURL.mjs';\nimport {logger} from 'workbox-core/_private/logger.mjs';\nimport {getCacheKeyForURL} from './getCacheKeyForURL.mjs';\nimport '../_version.mjs';\n\n\n/**\n * Adds a `fetch` listener to the service worker that will\n * respond to\n * [network requests]{@link https://developer.mozilla.org/en-US/docs/Web/API/Service_Worker_API/Using_Service_Workers#Custom_responses_to_requests}\n * with precached assets.\n *\n * Requests for assets 
that aren't precached, the `FetchEvent` will not be\n * responded to, allowing the event to fall through to other `fetch` event\n * listeners.\n *\n * NOTE: when called more than once this method will replace the previously set\n * configuration options. Calling it more than once is not recommended outside\n * of tests.\n *\n * @private\n * @param {Object} options\n * @param {string} [options.directoryIndex=index.html] The `directoryIndex` will\n * check cache entries for a URLs ending with '/' to see if there is a hit when\n * appending the `directoryIndex` value.\n * @param {Array} [options.ignoreURLParametersMatching=[/^utm_/]] An\n * array of regex's to remove search params when looking for a cache match.\n * @param {boolean} [options.cleanURLs=true] The `cleanURLs` option will\n * check the cache for the URL with a `.html` added to the end of the end.\n * @param {workbox.precaching~urlManipulation} [options.urlManipulation]\n * This is a function that should take a URL and return an array of\n * alternative URL's that should be checked for precache matches.\n */\nexport const addFetchListener = ({\n ignoreURLParametersMatching = [/^utm_/],\n directoryIndex = 'index.html',\n cleanURLs = true,\n urlManipulation = null,\n} = {}) => {\n const cacheName = cacheNames.getPrecacheName();\n\n addEventListener('fetch', (event) => {\n const precachedURL = getCacheKeyForURL(event.request.url, {\n cleanURLs,\n directoryIndex,\n ignoreURLParametersMatching,\n urlManipulation,\n });\n if (!precachedURL) {\n if (process.env.NODE_ENV !== 'production') {\n logger.debug(`Precaching did not find a match for ` +\n getFriendlyURL(event.request.url));\n }\n return;\n }\n\n let responsePromise = caches.open(cacheName).then((cache) => {\n return cache.match(precachedURL);\n }).then((cachedResponse) => {\n if (cachedResponse) {\n return cachedResponse;\n }\n\n // Fall back to the network if we don't have a cached response\n // (perhaps due to manual cache cleanup).\n if 
(process.env.NODE_ENV !== 'production') {\n logger.warn(`The precached response for ` +\n `${getFriendlyURL(precachedURL)} in ${cacheName} was not found. ` +\n `Falling back to the network instead.`);\n }\n\n return fetch(precachedURL);\n });\n\n if (process.env.NODE_ENV !== 'production') {\n responsePromise = responsePromise.then((response) => {\n // Workbox is going to handle the route.\n // print the routing details to the console.\n logger.groupCollapsed(`Precaching is responding to: ` +\n getFriendlyURL(event.request.url));\n logger.log(`Serving the precached url: ${precachedURL}`);\n\n logger.groupCollapsed(`View request details here.`);\n logger.log(event.request);\n logger.groupEnd();\n\n logger.groupCollapsed(`View response details here.`);\n logger.log(response);\n logger.groupEnd();\n\n logger.groupEnd();\n return response;\n });\n }\n\n event.respondWith(responsePromise);\n });\n};\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {logger} from 'workbox-core/_private/logger.mjs';\nimport {getOrCreatePrecacheController} from './utils/getOrCreatePrecacheController.mjs';\nimport {precachePlugins} from './utils/precachePlugins.mjs';\nimport './_version.mjs';\n\n\nconst installListener = (event) => {\n const precacheController = getOrCreatePrecacheController();\n const plugins = precachePlugins.get();\n\n event.waitUntil(\n precacheController.install({event, plugins})\n .catch((error) => {\n if (process.env.NODE_ENV !== 'production') {\n logger.error(`Service worker installation failed. 
It will ` +\n `be retried automatically during the next navigation.`);\n }\n // Re-throw the error to ensure installation fails.\n throw error;\n })\n );\n};\n\nconst activateListener = (event) => {\n const precacheController = getOrCreatePrecacheController();\n const plugins = precachePlugins.get();\n\n event.waitUntil(precacheController.activate({event, plugins}));\n};\n\n/**\n * Adds items to the precache list, removing any duplicates and\n * stores the files in the\n * [\"precache cache\"]{@link module:workbox-core.cacheNames} when the service\n * worker installs.\n *\n * This method can be called multiple times.\n *\n * Please note: This method **will not** serve any of the cached files for you.\n * It only precaches files. To respond to a network request you call\n * [addRoute()]{@link module:workbox-precaching.addRoute}.\n *\n * If you have a single array of files to precache, you can just call\n * [precacheAndRoute()]{@link module:workbox-precaching.precacheAndRoute}.\n *\n * @param {Array} entries Array of entries to precache.\n *\n * @alias workbox.precaching.precache\n */\nexport const precache = (entries) => {\n const precacheController = getOrCreatePrecacheController();\n precacheController.addToCacheList(entries);\n\n if (entries.length > 0) {\n // NOTE: these listeners will only be added once (even if the `precache()`\n // method is called multiple times) because event listeners are implemented\n // as a set, where each listener must be unique.\n addEventListener('install', installListener);\n addEventListener('activate', activateListener);\n }\n};\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {precachePlugins} from './utils/precachePlugins.mjs';\nimport './_version.mjs';\n\n\n/**\n * Adds plugins to precaching.\n *\n * @param {Array} newPlugins\n *\n * @alias workbox.precaching.addPlugins\n */\nconst 
addPlugins = (newPlugins) => {\n precachePlugins.add(newPlugins);\n};\n\nexport {addPlugins};\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {cacheNames} from 'workbox-core/_private/cacheNames.mjs';\nimport {logger} from 'workbox-core/_private/logger.mjs';\nimport {deleteOutdatedCaches} from './utils/deleteOutdatedCaches.mjs';\nimport './_version.mjs';\n\n\n/**\n * Adds an `activate` event listener which will clean up incompatible\n * precaches that were created by older versions of Workbox.\n *\n * @alias workbox.precaching.cleanupOutdatedCaches\n */\nexport const cleanupOutdatedCaches = () => {\n addEventListener('activate', (event) => {\n const cacheName = cacheNames.getPrecacheName();\n\n event.waitUntil(deleteOutdatedCaches(cacheName).then((cachesDeleted) => {\n if (process.env.NODE_ENV !== 'production') {\n if (cachesDeleted.length > 0) {\n logger.log(`The following out-of-date precaches were cleaned up ` +\n `automatically:`, cachesDeleted);\n }\n }\n }));\n });\n};\n","/*\n Copyright 2018 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport '../_version.mjs';\n\nconst SUBSTRING_TO_FIND = '-precache-';\n\n/**\n * Cleans up incompatible precaches that were created by older versions of\n * Workbox, by a service worker registered under the current scope.\n *\n * This is meant to be called as part of the `activate` event.\n *\n * This should be safe to use as long as you don't include `substringToFind`\n * (defaulting to `-precache-`) in your non-precache cache names.\n *\n * @param {string} currentPrecacheName The cache name currently in use for\n * precaching. 
This cache won't be deleted.\n * @param {string} [substringToFind='-precache-'] Cache names which include this\n * substring will be deleted (excluding `currentPrecacheName`).\n * @return {Array} A list of all the cache names that were deleted.\n *\n * @private\n * @memberof module:workbox-precaching\n */\nconst deleteOutdatedCaches = async (\n currentPrecacheName,\n substringToFind = SUBSTRING_TO_FIND) => {\n const cacheNames = await caches.keys();\n\n const cacheNamesToDelete = cacheNames.filter((cacheName) => {\n return cacheName.includes(substringToFind) &&\n cacheName.includes(self.registration.scope) &&\n cacheName !== currentPrecacheName;\n });\n\n await Promise.all(\n cacheNamesToDelete.map((cacheName) => caches.delete(cacheName)));\n\n return cacheNamesToDelete;\n};\n\nexport {deleteOutdatedCaches};\n\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {getOrCreatePrecacheController}\n from './utils/getOrCreatePrecacheController.mjs';\nimport './_version.mjs';\n\n\n/**\n * Takes in a URL, and returns the corresponding URL that could be used to\n * lookup the entry in the precache.\n *\n * If a relative URL is provided, the location of the service worker file will\n * be used as the base.\n *\n * For precached entries without revision information, the cache key will be the\n * same as the original URL.\n *\n * For precached entries with revision information, the cache key will be the\n * original URL with the addition of a query parameter used for keeping track of\n * the revision info.\n *\n * @param {string} url The URL whose cache key to look up.\n * @return {string} The cache key that corresponds to that URL.\n *\n * @alias workbox.precaching.getCacheKeyForURL\n */\nexport const getCacheKeyForURL = (url) => {\n const precacheController = getOrCreatePrecacheController();\n return 
precacheController.getCacheKeyForURL(url);\n};\n","/*\n Copyright 2019 Google LLC\n\n Use of this source code is governed by an MIT-style\n license that can be found in the LICENSE file or at\n https://opensource.org/licenses/MIT.\n*/\n\nimport {addRoute} from './addRoute.mjs';\nimport {precache} from './precache.mjs';\nimport './_version.mjs';\n\n\n/**\n * This method will add entries to the precache list and add a route to\n * respond to fetch events.\n *\n * This is a convenience method that will call\n * [precache()]{@link module:workbox-precaching.precache} and\n * [addRoute()]{@link module:workbox-precaching.addRoute} in a single call.\n *\n * @param {Array} entries Array of entries to precache.\n * @param {Object} options See\n * [addRoute() options]{@link module:workbox-precaching.addRoute}.\n *\n * @alias workbox.precaching.precacheAndRoute\n */\nexport const precacheAndRoute = (entries, options) => {\n precache(entries);\n addRoute(options);\n};\n"],"names":["self","_","e","plugins","precachePlugins","get","add","newPlugins","push","REVISION_SEARCH_PARAM","createCacheKey","entry","WorkboxError","urlObject","URL","location","cacheKey","href","url","revision","originalURL","cacheKeyURL","searchParams","set","PrecacheController","constructor","cacheName","_cacheName","cacheNames","getPrecacheName","_urlsToCacheKeys","Map","addToCacheList","entries","this","has","firstEntry","secondEntry","event","urlsToPrecache","urlsAlreadyPrecached","cache","caches","open","alreadyCachedRequests","keys","alreadyCachedURLs","Set","map","request","values","precacheRequests","_addURLToCache","Promise","all","updatedURLs","notUpdatedURLs","currentlyCachedRequests","expectedCacheKeys","deletedURLs","delete","Request","credentials","cacheWillUpdateCallback","response","fetchWrapper","fetch","fetchOptions","importance","plugin","cacheWillUpdate","bind","status","redirected","async","clonedResponse","clone","bodyPromise","resolve","body","blob","Response","headers","statusText","cl
eanRedirect","cacheWrapper","put","matchOptions","ignoreSearch","getURLsToCacheKeys","getCachedURLs","getCacheKeyForURL","precacheController","getOrCreatePrecacheController","options","urlsToCacheKeys","possibleURL","ignoreURLParametersMatching","directoryIndex","cleanURLs","urlManipulation","hash","urlWithoutIgnoredParams","paramName","some","regExp","test","removeIgnoredSearchParams","pathname","endsWith","directoryURL","cleanURL","additionalURLs","urlToAttempt","generateURLVariations","possibleCacheKey","listenerAdded","addRoute","addEventListener","precachedURL","responsePromise","then","match","cachedResponse","respondWith","addFetchListener","installListener","waitUntil","install","catch","error","activateListener","activate","precache","length","currentPrecacheName","substringToFind","cacheNamesToDelete","filter","includes","registration","scope","deleteOutdatedCaches","cachesDeleted"],"mappings":"uFAAA,IAAIA,KAAK,6BAA6BC,IAAI,MAAMC,ICWhD,MAAMC,EAAU,GAEHC,EAAkB,CAK7BC,IAAG,IACMF,EAOTG,IAAIC,GACFJ,EAAQK,QAAQD,KCdpB,MAAME,EAAwB,kBAWvB,SAASC,EAAeC,OACxBA,QACG,IAAIC,eAAa,oCAAqC,CAACD,MAAAA,OAK1C,iBAAVA,EAAoB,OACvBE,EAAY,IAAIC,IAAIH,EAAOI,gBAC1B,CACLC,SAAUH,EAAUI,KACpBC,IAAKL,EAAUI,YAIbE,SAACA,EAADD,IAAWA,GAAOP,MACnBO,QACG,IAAIN,eAAa,oCAAqC,CAACD,MAAAA,QAK1DQ,EAAU,OACPN,EAAY,IAAIC,IAAII,EAAKH,gBACxB,CACLC,SAAUH,EAAUI,KACpBC,IAAKL,EAAUI,YAMbG,EAAc,IAAIN,IAAII,EAAKH,UAC3BM,EAAc,IAAIP,IAAII,EAAKH,iBACjCM,EAAYC,aAAaC,IAAId,EAAuBU,GAC7C,CACLH,SAAUK,EAAYJ,KACtBC,IAAKE,EAAYH,MClCrB,MAAMO,EAOJC,YAAYC,QACLC,EAAaC,aAAWC,gBAAgBH,QACxCI,EAAmB,IAAIC,IAW9BC,eAAeC,OAUR,MAAMtB,KAASsB,EAAS,OACrBjB,SAACA,EAADE,IAAWA,GAAOR,EAAeC,MACnCuB,KAAKJ,EAAiBK,IAAIjB,IAC1BgB,KAAKJ,EAAiBzB,IAAIa,KAASF,QAC/B,IAAIJ,eAAa,wCAAyC,CAC9DwB,WAAYF,KAAKJ,EAAiBzB,IAAIa,GACtCmB,YAAarB,SAGZc,EAAiBP,IAAIL,EAAKF,mBAcrBsB,MAACA,EAADnC,QAAQA,GAAW,UAYzBoC,EAAiB,GACjBC,EAAuB,GAEvBC,QAAcC,OAAOC,KAAKT,KAAKP,GAC/BiB,QAA8BH,EAAMI,OACpCC,EAAoB,IAAIC,IAAIH,EAAsBI,IACnDC,GAAYA,EAAQ/B,UAEpB,MAAMF,KAAYkB,KAAKJ,EAAiBoB,SAC
vCJ,EAAkBX,IAAInB,GACxBwB,EAAqBhC,KAAKQ,GAE1BuB,EAAe/B,KAAKQ,SAIlBmC,EAAmBZ,EAAeS,IAAK9B,GACpCgB,KAAKkB,EAAe,CAACd,MAAAA,EAAOnC,QAAAA,EAASe,IAAAA,kBAExCmC,QAAQC,IAAIH,GAMX,CACLI,YAAahB,EACbiB,eAAgBhB,0BAWZC,QAAcC,OAAOC,KAAKT,KAAKP,GAC/B8B,QAAgChB,EAAMI,OACtCa,EAAoB,IAAIX,IAAIb,KAAKJ,EAAiBoB,UAElDS,EAAc,OACf,MAAMV,KAAWQ,EACfC,EAAkBvB,IAAIc,EAAQ/B,aAC3BuB,EAAMmB,OAAOX,GACnBU,EAAYnD,KAAKyC,EAAQ/B,YAQtB,CAACyC,YAAAA,YAmBWzC,IAACA,EAADoB,MAAMA,EAANnC,QAAaA,UAC1B8C,EAAU,IAAIY,QAAQ3C,EAAK,CAAC4C,YAAa,oBAW3CC,EAVAC,QAAiBC,eAAaC,MAAM,CACtC5B,MAAAA,EACAnC,QAAAA,EACA8C,QAAAA,EACAkB,aAAc,CAAEC,WAAY,aAOzB,MAAMC,KAAWlE,GAAW,GAC3B,oBAAqBkE,IACvBN,EAA0BM,EAAOC,gBAAgBC,KAAKF,SAIlCN,EAEtBA,EAAwB,CAACzB,MAAAA,EAAOW,QAAAA,EAASe,SAAAA,IAGzCA,EAASQ,OAAS,WAKZ,IAAI5D,eAAa,0BAA2B,CAChDM,IAAAA,EACAsD,OAAQR,EAASQ,SAIjBR,EAASS,aACXT,QCxLCU,eAA6BV,SAC5BW,EAAiBX,EAASY,QAI1BC,EAAc,SAAUF,EAC5BtB,QAAQyB,QAAQH,EAAeI,MAC/BJ,EAAeK,OAEXD,QAAaF,SAGZ,IAAII,SAASF,EAAM,CACxBG,QAASP,EAAeO,QACxBV,OAAQG,EAAeH,OACvBW,WAAYR,EAAeQ,aDyKRC,CAAcpB,UAG3BqB,eAAaC,IAAI,CACrBhD,MAAAA,EACAnC,QAAAA,EACA8C,QAAAA,EACAe,SAAAA,EACAtC,UAAWQ,KAAKP,EAChB4D,aAAc,CACZC,cAAc,KAWpBC,4BACSvD,KAAKJ,EASd4D,sBACS,IAAIxD,KAAKJ,EAAiBe,QAYnC8C,kBAAkBzE,SACVL,EAAY,IAAIC,IAAII,EAAKH,iBACxBmB,KAAKJ,EAAiBzB,IAAIQ,EAAUI,OE3O/C,IAAI2E,EAMG,MAAMC,EAAgC,KACtCD,IACHA,EAAqB,IAAIpE,GAEpBoE,GCEF,MAAMD,EAAoB,CAACzE,EAAK4E,WAG/BC,EAFqBF,IAEgBJ,yBACtC,MAAMO,KCNN,UAAgC9E,GAAK+E,4BAC1CA,EAD0CC,eAE1CA,EAF0CC,UAG1CA,EAH0CC,gBAI1CA,GACE,UACIvF,EAAY,IAAIC,IAAII,EAAKH,UAC/BF,EAAUwF,KAAO,SACXxF,EAAUI,WAEVqF,ECVD,SAAmCzF,EACtCoF,OAGG,MAAMM,IAAa,IAAI1F,EAAUS,aAAauB,QAC7CoD,EAA4BO,KAAMC,GAAWA,EAAOC,KAAKH,KAC3D1F,EAAUS,aAAasC,OAAO2C,UAI3B1F,EDAyB8F,CAC5B9F,EAAWoF,YACTK,EAAwBrF,KAE1BiF,GAAkBI,EAAwBM,SAASC,SAAS,KAAM,OAC9DC,EAAe,IAAIhG,IAAIwF,GAC7BQ,EAAaF,UAAYV,QACnBY,EAAa7F,QAGjBkF,EAAW,OACPY,EAAW,IAAIjG,IAAIwF,GACzBS,EAASH,UAAY,cACfG,EAAS9F,QAGbmF,EAAiB,OACbY,EAAiBZ,EAAgB,CAAClF,IAAKL,QACxC,MAAMoG,KAAgBD,QACnBC,EAAahG,MDvBGiG,CAAsBhG,EAAK4E,GAAU,OACvDqB,EAAmBpB,EAAg
B1F,IAAI2F,MACzCmB,SACKA,IGnBb,IAAIC,GAAgB,QA0BPC,EAAYvB,IAClBsB,ICGyB,GAC9BnB,4BAAAA,EAA8B,CAAC,SAC/BC,eAAAA,EAAiB,aACjBC,UAAAA,GAAY,EACZC,gBAAAA,EAAkB,MAChB,YACI1E,EAAYE,aAAWC,kBAE7ByF,iBAAiB,QAAUhF,UACnBiF,EAAe5B,EAAkBrD,EAAMW,QAAQ/B,IAAK,CACxDiF,UAAAA,EACAD,eAAAA,EACAD,4BAAAA,EACAG,gBAAAA,QAEGmB,aAQDC,EAAkB9E,OAAOC,KAAKjB,GAAW+F,KAAMhF,GAC1CA,EAAMiF,MAAMH,IAClBE,KAAME,GACHA,GAYGzD,MAAMqD,IAwBfjF,EAAMsF,YAAYJ,MDhElBK,CAAiB/B,GACjBsB,GAAgB,IE3BdU,EAAmBxF,UACjBsD,EAAqBC,IACrB1F,EAAUC,EAAgBC,MAEhCiC,EAAMyF,UACFnC,EAAmBoC,QAAQ,CAAC1F,MAAAA,EAAOnC,QAAAA,IAC9B8H,MAAOC,UAMAA,MAKZC,EAAoB7F,UAClBsD,EAAqBC,IACrB1F,EAAUC,EAAgBC,MAEhCiC,EAAMyF,UAAUnC,EAAmBwC,SAAS,CAAC9F,MAAAA,EAAOnC,QAAAA,MAsBzCkI,EAAYpG,IACI4D,IACR7D,eAAeC,GAE9BA,EAAQqG,OAAS,IAInBhB,iBAAiB,UAAWQ,GAC5BR,iBAAiB,WAAYa,gDC/Cb5H,CAAAA,IAClBH,EAAgBE,IAAIC,0CCAe,MACnC+G,iBAAiB,WAAahF,UACtBZ,EAAYE,aAAWC,kBAE7BS,EAAMyF,UCMmBrD,OAC3B6D,EACAC,EAtBwB,sBAyBlBC,SAFmB/F,OAAOG,QAEM6F,OAAQhH,GACrCA,EAAUiH,SAASH,IACnB9G,EAAUiH,SAAS3I,KAAK4I,aAAaC,QACrCnH,IAAc6G,gBAGjBlF,QAAQC,IACVmF,EAAmBzF,IAAKtB,GAAcgB,OAAOkB,OAAOlC,KAEjD+G,GDpBWK,CAAqBpH,GAAW+F,KAAMsB,gCEQxB7H,CAAAA,WACL2E,IACDF,kBAAkBzE,qCCPd,EAACe,EAAS6D,KACxCuC,EAASpG,GACToF,EAASvB"} -\ No newline at end of file diff --git a/packages/pwa/src/service-worker.ts b/packages/pwa/src/service-worker.ts deleted file mode 100644 index 3f8b0cb3d..000000000 --- a/packages/pwa/src/service-worker.ts +++ /dev/null @@ -1,12 +0,0 @@ -import * as workbox from 'workbox-build'; - -export const buildServiceWorker = async (dest: string) => { - await workbox.generateSW({ - globDirectory: dest, - globPatterns: ['**/*.{js,json,css,html,ttf,woff,woff2,eot,webp}'], - importWorkboxFrom: 'local', - swDest: `${dest}/service-worker.js`, - skipWaiting: true, - clientsClaim: true - }); -}; diff --git a/packages/pwa/tsconfig.json b/packages/pwa/tsconfig.json deleted file mode 100644 index f1a0e1870..000000000 --- a/packages/pwa/tsconfig.json +++ /dev/null @@ -1,9 +0,0 @@ -{ - "extends": 
"../tsconfig.lib.json", - "compilerOptions": { - "outDir": "./dist", - "baseUrl": "./" - }, - "include": ["src"], - "exclude": ["node_modules", "dist", "reports", "src/__tests__"] -} diff --git a/packages/router/package.json b/packages/router/package.json deleted file mode 100644 index e2677f7b4..000000000 --- a/packages/router/package.json +++ /dev/null @@ -1,63 +0,0 @@ -{ - "name": "@blog/router", - "version": "6.26.198", - "description": "routing generator tools", - "author": "aquariuslt ", - "homepage": "https://github.com/aquariuslt/blog#readme", - "license": "MIT", - "repository": { - "type": "git", - "url": "git+https://github.com/aquariuslt/blog.git" - }, - "private": true, - "main": "dist/index.js", - "types": "dist/index.d.ts", - "files": [ - "dist" - ], - "scripts": { - "clean": "rimraf dist", - "test": "jest", - "build:lib": "tsc" - }, - "dependencies": { - "@blog/article": "^6.26.198", - "@blog/common": "^6.26.198", - "@blog/config": "^6.26.198", - "date-fns": "2.19.0", - "lodash": "4.17.21", - "path-to-regexp": "6.2.0", - "sitemap": "5.1.0", - "uslug": "1.0.4" - }, - "devDependencies": { - "@types/fancy-log": "1.3.1", - "@types/jest": "26.0.20", - "@types/lodash": "4.14.168", - "@types/node": "13.13.45", - "jest": "26.6.3", - "jest-raw-loader": "1.0.1", - "ts-jest": "26.5.3", - "typescript": "4.0.5" - }, - "jest": { - "moduleFileExtensions": [ - "ts", - "js", - "json" - ], - "transform": { - "^.+\\.ts$": "ts-jest", - "^.*\\.md$": "jest-raw-loader" - }, - "collectCoverageFrom": [ - "!**/__tests__/**", - "/src/**/*.ts" - ], - "testMatch": [ - "/src/**/*.test.ts" - ], - "testEnvironment": "node", - "coverageDirectory": "/coverage" - } -} diff --git a/packages/router/src/__tests__/match.util.test.ts b/packages/router/src/__tests__/match.util.test.ts deleted file mode 100644 index a66f04962..000000000 --- a/packages/router/src/__tests__/match.util.test.ts +++ /dev/null @@ -1,33 +0,0 @@ -import { compile } from 'path-to-regexp'; -import { RoutesPathRegex } 
from '@blog/common/interfaces/routes'; - -describe('match.util', () => { - it('# should build post detail path without any extra unused properties', () => { - const toPath = compile(RoutesPathRegex.POST_DETAIL); - - const detailPath = toPath({ - year: '2019', - month: '12', - date: '09', - id: 'awesome-post' - }); - - const expectedPath = '/posts/2019/12/09/awesome-post'; - expect(detailPath).toEqual(expectedPath); - }); - - it('# should build post detail path with extra unused properties', () => { - const toPath = compile(RoutesPathRegex.POST_DETAIL); - - const detailPath = toPath({ - year: '2019', - month: '12', - date: '09', - id: 'awesome-post', - tag: 'useless-tag-args' - }); - - const expectedPath = '/posts/2019/12/09/awesome-post'; - expect(detailPath).toEqual(expectedPath); - }); -}); diff --git a/packages/router/src/category.route.util.ts b/packages/router/src/category.route.util.ts deleted file mode 100644 index b9eb8469e..000000000 --- a/packages/router/src/category.route.util.ts +++ /dev/null @@ -1,36 +0,0 @@ -import * as _ from 'lodash'; -import * as uslug from 'uslug'; -import { Meta, MetaName, RoutePathPrefix } from '@blog/common/interfaces/routes'; -export const createCategoriesOverviewRouteItem = () => ({ - id: RoutePathPrefix.CATEGORIES, - label: `Categories` // TODO: add i18n support -}); - -export const createCategoryDetailRouteItem = (rawCategory: string) => ({ - id: uslug(rawCategory), - label: `Category: ${rawCategory}` -}); - -export const createCategoriesOverviewDescMeta = (): Meta => ({ - name: MetaName.DESCRIPTION, - itemprop: MetaName.DESCRIPTION, - content: `Categories` -}); - -export const createCategoryDetailDescMeta = (rawCategory: string): Meta => ({ - name: MetaName.DESCRIPTION, - itemprop: MetaName.DESCRIPTION, - content: `Category: ${rawCategory}` -}); - -export const createCategoryDetailOpenGraphMetas = (rawCategory: string): Meta[] => [ - { - name: MetaName.OPEN_GRAPH_DESCRIPTION, - itemprop: MetaName.DESCRIPTION, - content: 
`Category: ${rawCategory}` - } -]; - -export const createCategoriesOverviewMetas = (): Meta[] => [createCategoriesOverviewDescMeta()]; -export const createCategoryDetailMetas = (rawCategory: string): Meta[] => - _.concat(createCategoryDetailOpenGraphMetas(rawCategory), [createCategoryDetailDescMeta(rawCategory)]); diff --git a/packages/router/src/home.route.util.ts b/packages/router/src/home.route.util.ts deleted file mode 100644 index 8c013c643..000000000 --- a/packages/router/src/home.route.util.ts +++ /dev/null @@ -1,14 +0,0 @@ -import { Meta, MetaName, RoutePathPrefix } from '@blog/common/interfaces/routes'; - -export const createHomeRouteItem = () => ({ - id: RoutePathPrefix.HOME, - label: 'Home' -}); - -export const createHomeDescMeta = (): Meta => ({ - name: MetaName.DESCRIPTION, - itemprop: MetaName.DESCRIPTION, - content: `Home` -}); - -export const createHomeMetas = (): Meta[] => [createHomeDescMeta()]; diff --git a/packages/router/src/index.ts b/packages/router/src/index.ts deleted file mode 100644 index be07d1084..000000000 --- a/packages/router/src/index.ts +++ /dev/null @@ -1,285 +0,0 @@ -// context collection processing -import * as _ from 'lodash'; -import { ArticleContext } from '@blog/common/interfaces/articles/article-context'; -import { Meta, MetaName, MetaValue, RouteMeta, RoutePathPrefix } from '@blog/common/interfaces/routes'; -import { Layout } from '@blog/common/interfaces/routes/layout'; -import { buildFullURL, buildURLPath } from '@blog/common/utils/path.util'; -import { format } from 'date-fns'; -import { buildTitle } from './title.util'; -import { - createTagDetailMetas, - createTagDetailRouteItem, - createTagsOverviewMetas, - createTagsOverviewRouteItem -} from './tag.route.util'; -import { - createCategoriesOverviewMetas, - createCategoriesOverviewRouteItem, - createCategoryDetailMetas, - createCategoryDetailRouteItem -} from './category.route.util'; -import { - createPageDetailMetas, - createPostDetailMetas, - 
createPostsOverviewMetas, - createPostsOverviewRouteItem -} from './post.route.util'; -import { - createBreadcrumbList, - createCategoriesOverviewBreadcrumbItem, - createCategoryDetailBreadcrumbItem, - createHomeBreadcrumbItem, - createPageDetailBreadcrumbItem, - createPostDetailBreadcrumbItem, - createPostsOverviewBreadcrumbItem, - createTagDetailBreadcrumbItem, - createTagsOverviewBreadcrumbItem -} from './breadcrumb.util'; -import { createHomeMetas } from './home.route.util'; -import { loadConfig } from '@blog/config'; - -export * from './home.route.util'; -export * from './tag.route.util'; -export * from './category.route.util'; -export * from './post.route.util'; -export * from './sitemap.util'; - -export interface RoutesOptions { - baseUrl: string; - baseTitle: string; - titleSeparator: string; -} - -export const createCommonMetas = (options: Partial): Meta[] => [ - { - name: MetaName.OPEN_GRAPH_SITE_NAME, - content: options.baseTitle - }, - { - name: MetaName.OPEN_GRAPH_TYPE, - content: MetaValue.WEBSITE - } -]; - -export const createGoogleAnalyticsMeta = (): Meta[] => { - const config = loadConfig(); - return config.site.googleAnalytics - ? 
[ - { - name: MetaName.GOOGLE_SITE_VERIFICATION, - content: config.site.googleAnalytics.verification - }, - { - id: MetaName.GOOGLE_SITE_TRACKING, - name: MetaName.GOOGLE_SITE_TRACKING, - content: config.site.googleAnalytics.tracking - } - ] - : []; -}; - -export const createTagsOverviewRouteMeta = ( - contexts: ArticleContext[], - options?: Partial -): RouteMeta => { - const tagsOverviewRouteItem = createTagsOverviewRouteItem(); - const path = buildURLPath(RoutePathPrefix.TAGS); - const title = buildTitle(tagsOverviewRouteItem.label, options.baseTitle, options.titleSeparator); - const url = buildFullURL(options.baseUrl, path); - - return { - key: path, - url: url, - path: path, - title: title, - breadcrumbs: createBreadcrumbList([ - createHomeBreadcrumbItem(options.baseUrl, options.baseTitle, RoutePathPrefix.HOME), - createTagsOverviewBreadcrumbItem(options.baseUrl, tagsOverviewRouteItem.label, path) - ]), - type: Layout.TABLE, - metas: _.concat(createGoogleAnalyticsMeta(), createTagsOverviewMetas(), createCommonMetas(options)), - data: undefined - }; -}; -export const createCategoriesOverviewRouteMeta = ( - contexts: ArticleContext[], - options?: Partial -): RouteMeta => { - const categoriesOverviewRouteItem = createCategoriesOverviewRouteItem(); - const path = buildURLPath(RoutePathPrefix.CATEGORIES); - const title = buildTitle(categoriesOverviewRouteItem.label, options.baseTitle, options.titleSeparator); - const url = buildFullURL(options.baseUrl, path); - - return { - key: path, - url: url, - path: path, - title: title, - breadcrumbs: createBreadcrumbList([ - createHomeBreadcrumbItem(options.baseUrl, options.baseTitle, RoutePathPrefix.HOME), - createCategoriesOverviewBreadcrumbItem(options.baseUrl, categoriesOverviewRouteItem.label, path) - ]), - type: Layout.TABLE, - metas: _.concat(createGoogleAnalyticsMeta(), createCategoriesOverviewMetas(), createCommonMetas(options)), - data: undefined - }; -}; -export const createPostsOverviewRouteMeta = ( - contexts: 
ArticleContext[],
-  options?: Partial<RoutesOptions>
-): RouteMeta => {
-  const postsOverviewRouteItem = createPostsOverviewRouteItem();
-  const path = buildURLPath(RoutePathPrefix.POSTS);
-  const title = buildTitle(postsOverviewRouteItem.label, options.baseTitle, options.titleSeparator);
-  const url = buildFullURL(options.baseUrl, path);
-
-  return {
-    key: path,
-    url: url,
-    path: path,
-    title: title,
-    breadcrumbs: createBreadcrumbList([
-      createHomeBreadcrumbItem(options.baseUrl, options.baseTitle, RoutePathPrefix.HOME),
-      createPostsOverviewBreadcrumbItem(options.baseUrl, postsOverviewRouteItem.label, path)
-    ]),
-    type: Layout.LIST,
-    metas: _.concat(createGoogleAnalyticsMeta(), createPostsOverviewMetas(), createCommonMetas(options)),
-    data: undefined
-  };
-};
-
-export const createHomeRouteMeta = (options?: Partial<RoutesOptions>): RouteMeta => {
-  const path = buildURLPath();
-  const title = options.baseTitle;
-  const url = buildFullURL(options.baseUrl, path);
-
-  return {
-    key: path,
-    url: url,
-    path: path,
-    title: title,
-    breadcrumbs: createBreadcrumbList([
-      createHomeBreadcrumbItem(options.baseUrl, options.baseTitle, RoutePathPrefix.HOME)
-    ]),
-    type: Layout.LIST,
-    metas: _.concat(createGoogleAnalyticsMeta(), createHomeMetas(), createCommonMetas(options)),
-    data: undefined
-  };
-};
-
-// single item
-export const createTagDetailRouteMeta = (
-  rawTag: string,
-  contexts: ArticleContext[],
-  options?: Partial<RoutesOptions>
-): RouteMeta => {
-  const tagsOverviewRouteItem = createTagsOverviewRouteItem();
-  const tagsOverviewRouteMeta = createTagsOverviewRouteMeta(contexts, options);
-
-  const tagInfo = createTagDetailRouteItem(rawTag);
-  const path = buildURLPath(RoutePathPrefix.TAGS, tagInfo.id);
-  const title = buildTitle(tagInfo.label, options.baseTitle, options.titleSeparator);
-  const url = buildFullURL(options.baseUrl, path);
-
-  return {
-    key: rawTag,
-    url: url,
-    path: path,
-    title: title,
-    breadcrumbs: createBreadcrumbList([
-      createHomeBreadcrumbItem(options.baseUrl,
options.baseTitle, RoutePathPrefix.HOME),
-      createTagsOverviewBreadcrumbItem(options.baseUrl, tagsOverviewRouteItem.label, tagsOverviewRouteMeta.path),
-      createTagDetailBreadcrumbItem(options.baseUrl, rawTag, path)
-    ]),
-    metas: _.concat(createGoogleAnalyticsMeta(), createCommonMetas(options), createTagDetailMetas(rawTag)),
-    type: Layout.LIST,
-    data: undefined
-  };
-};
-
-export const createCategoryDetailRouteMeta = (
-  rawCategory: string,
-  contexts: ArticleContext[],
-  options?: Partial<RoutesOptions>
-): RouteMeta => {
-  const categoriesOverviewRouteItem = createCategoriesOverviewRouteItem();
-  const categoriesOverviewRouteMeta = createCategoriesOverviewRouteMeta(contexts, options);
-
-  const categoryInfo = createCategoryDetailRouteItem(rawCategory);
-  const path = buildURLPath(RoutePathPrefix.CATEGORIES, categoryInfo.id);
-  const title = buildTitle(categoryInfo.label, options.baseTitle, options.titleSeparator);
-  const url = buildFullURL(options.baseUrl, path);
-
-  return {
-    key: rawCategory,
-    url: url,
-    path: path,
-    title: title,
-    breadcrumbs: createBreadcrumbList([
-      createHomeBreadcrumbItem(options.baseUrl, options.baseTitle, RoutePathPrefix.HOME),
-      createCategoriesOverviewBreadcrumbItem(
-        options.baseUrl,
-        categoriesOverviewRouteItem.label,
-        categoriesOverviewRouteMeta.path
-      ),
-      createCategoryDetailBreadcrumbItem(options.baseUrl, rawCategory, path)
-    ]),
-    type: Layout.LIST,
-    metas: _.concat(createGoogleAnalyticsMeta(), createCommonMetas(options), createCategoryDetailMetas(rawCategory)),
-    data: undefined
-  };
-};
-
-export const createPostDetailRouteMeta = (
-  article: ArticleContext,
-  contexts: ArticleContext[],
-  options?: Partial<RoutesOptions>
-): RouteMeta => {
-  const postsOverviewRouteItem = createPostsOverviewRouteItem();
-  const postsOverviewRouteMeta = createPostsOverviewRouteMeta(contexts, options);
-  const created = new Date(article.created);
-
-  const year = format(created, 'yyyy');
-  const month = format(created, 'MM');
-  const date = format(created,
'dd');
-  const id = article.id;
-
-  const path = buildURLPath(RoutePathPrefix.POSTS, year, month, date, id);
-  const title = buildTitle(article.title, options.baseTitle, options.titleSeparator);
-  const url = buildFullURL(options.baseUrl, path);
-
-  return {
-    key: article.id,
-    url: url,
-    path: path,
-    title: title,
-    breadcrumbs: createBreadcrumbList([
-      createHomeBreadcrumbItem(options.baseUrl, options.baseTitle, RoutePathPrefix.HOME),
-      createPostsOverviewBreadcrumbItem(options.baseUrl, postsOverviewRouteItem.label, postsOverviewRouteMeta.path),
-      createPostDetailBreadcrumbItem(options.baseUrl, article.title, path)
-    ]),
-    type: Layout.DETAIL,
-    metas: _.concat(createGoogleAnalyticsMeta(), createCommonMetas(options), createPostDetailMetas(article, options)),
-    data: undefined
-  };
-};
-
-export const createPagesDetailRouteMeta = (article: ArticleContext, options?: Partial<RoutesOptions>): RouteMeta => {
-  const path = buildURLPath(RoutePathPrefix.PAGES, article.id);
-  const title = buildTitle(article.title, options.baseTitle, options.titleSeparator);
-  const url = buildFullURL(options.baseUrl, path);
-
-  return {
-    key: article.id,
-    path: path,
-    title: title,
-    url: url,
-    breadcrumbs: createBreadcrumbList([
-      createHomeBreadcrumbItem(options.baseUrl, options.baseTitle, RoutePathPrefix.HOME),
-      createPageDetailBreadcrumbItem(options.baseUrl, article.title, path)
-    ]),
-    metas: _.concat(createGoogleAnalyticsMeta(), createCommonMetas(options), createPageDetailMetas(article, options)),
-    type: Layout.DETAIL,
-    data: undefined
-  };
-};
diff --git a/packages/router/src/post.route.util.ts b/packages/router/src/post.route.util.ts
deleted file mode 100644
index 3c184f575..000000000
--- a/packages/router/src/post.route.util.ts
+++ /dev/null
@@ -1,94 +0,0 @@
-import * as _ from 'lodash';
-import * as path from 'path';
-import { Meta, MetaName, MetaValue, RoutePathPrefix } from '@blog/common/interfaces/routes';
-import { ArticleContext } from
'@blog/common/interfaces/articles/article-context';
-import { format } from 'date-fns';
-import { buildURLPath } from '@blog/common/utils/path.util';
-import { RoutesOptions } from '.';
-
-export const buildPostPathFromContext = (context: ArticleContext) => {
-  // build link
-  const created = new Date(context.created);
-  const year = format(created, 'yyyy');
-  const month = format(created, 'MM');
-  const date = format(created, 'dd');
-  const id = context.id;
-  return buildURLPath(RoutePathPrefix.POSTS, year, month, date, id);
-};
-
-export const buildPagePathFromContext = (context: ArticleContext) => {
-  return buildURLPath(RoutePathPrefix.PAGES, context.id);
-};
-
-export const createPostsOverviewRouteItem = () => ({
-  id: RoutePathPrefix.POSTS,
-  label: 'Posts' // TODO: add i18n support
-});
-
-export const createPostDetailRouteItem = (context) => ({
-  id: context.id,
-  label: context.title
-});
-
-export const createPostsOverviewDescMeta = (): Meta => ({
-  name: MetaName.DESCRIPTION,
-  itemprop: MetaName.DESCRIPTION,
-  content: `Posts`
-});
-
-export const createPostDetailDescMeta = (context: ArticleContext): Meta => ({
-  name: MetaName.DESCRIPTION,
-  itemprop: MetaName.DESCRIPTION,
-  content: context.summary
-});
-
-export const createPostDetailOpenGraphMetas = (context: ArticleContext, options: Partial<RoutesOptions>): Meta[] => [
-  {
-    name: MetaName.OPEN_GRAPH_TITLE,
-    itemprop: MetaName.NAME,
-    content: context.title
-  },
-  {
-    name: MetaName.OPEN_GRAPH_DESCRIPTION,
-    itemprop: MetaName.DESCRIPTION,
-    content: context.summary
-  },
-  {
-    name: MetaName.OPEN_GRAPH_IMAGE,
-    itemprop: MetaName.IMAGE,
-    content: path.join(options.baseUrl, buildPostPathFromContext(context), context.cover)
-  },
-  {
-    name: MetaName.OPEN_GRAPH_TYPE,
-    content: MetaValue.ARTICLE
-  }
-];
-
-export const createPageDetailOpenGraphMetas = (context: ArticleContext, options: Partial<RoutesOptions>): Meta[] => [
-  {
-    name: MetaName.OPEN_GRAPH_TITLE,
-    itemprop: MetaName.NAME,
-    content: context.title
-  },
-  {
-    name:
MetaName.OPEN_GRAPH_DESCRIPTION,
-    itemprop: MetaName.DESCRIPTION,
-    content: context.summary
-  },
-  {
-    name: MetaName.OPEN_GRAPH_IMAGE,
-    itemprop: MetaName.IMAGE,
-    content: path.join(options.baseUrl, buildPagePathFromContext(context), context.cover)
-  },
-  {
-    name: MetaName.OPEN_GRAPH_TYPE,
-    content: MetaValue.ARTICLE
-  }
-];
-
-export const createPostsOverviewMetas = (): Meta[] => [createPostsOverviewDescMeta()];
-export const createPostDetailMetas = (context: ArticleContext, options: Partial<RoutesOptions>): Meta[] =>
-  _.concat(createPostDetailOpenGraphMetas(context, options), [createPostDetailDescMeta(context)]);
-
-export const createPageDetailMetas = (context: ArticleContext, options: Partial<RoutesOptions>): Meta[] =>
-  _.concat(createPageDetailOpenGraphMetas(context, options), [createPostDetailDescMeta(context)]);
diff --git a/packages/router/src/sitemap.util.ts b/packages/router/src/sitemap.util.ts
deleted file mode 100644
index e3a166303..000000000
--- a/packages/router/src/sitemap.util.ts
+++ /dev/null
@@ -1,171 +0,0 @@
-import * as _ from 'lodash';
-import * as path from 'path';
-import { SitemapStream, streamToPromise } from 'sitemap';
-import { ArticleContext } from '@blog/common/interfaces/articles/article-context';
-import {
-  buildPostPathFromContext,
-  buildPagePathFromContext,
-  createCategoriesOverviewRouteMeta,
-  createCategoryDetailRouteMeta,
-  createHomeRouteMeta,
-  createPagesDetailRouteMeta,
-  createPostDetailRouteMeta,
-  createPostsOverviewRouteMeta,
-  createTagDetailRouteMeta,
-  createTagsOverviewRouteMeta,
-  RoutesOptions
-} from './index';
-import { getAllCategoriesFromContexts, getAllTagsFromContexts } from '@blog/article';
-
-const CHANGE_FREQ = 'daily';
-const DETAIL_PRIORITY = 1;
-const LIST_PRIORITY = 0.5;
-const KEYWORDS_SEPARATOR = ',';
-
-export const createHomeSitemapItem = (options?: Partial<RoutesOptions>) => {
-  const routeMeta = createHomeRouteMeta(options);
-  return {
-    url: routeMeta.url,
-    changefreq: CHANGE_FREQ,
-    priority: LIST_PRIORITY,
-    lastmod: new
Date().toISOString()
-  };
-};
-
-export const createTagsOverviewSitemapItem = (contexts: ArticleContext[], options?: Partial<RoutesOptions>) => {
-  const routeMeta = createTagsOverviewRouteMeta(contexts, options);
-  return {
-    url: routeMeta.url,
-    changefreq: CHANGE_FREQ,
-    priority: LIST_PRIORITY,
-    lastmod: new Date().toISOString()
-  };
-};
-
-export const createCategoriesOverviewSitemapItem = (contexts: ArticleContext[], options?: Partial<RoutesOptions>) => {
-  const routeMeta = createCategoriesOverviewRouteMeta(contexts, options);
-  return {
-    url: routeMeta.url,
-    changefreq: CHANGE_FREQ,
-    priority: LIST_PRIORITY,
-    lastmod: new Date().toISOString()
-  };
-};
-
-export const createPostsOverviewSitemapItem = (contexts: ArticleContext[], options?: Partial<RoutesOptions>) => {
-  const routeMeta = createPostsOverviewRouteMeta(contexts, options);
-
-  return {
-    url: routeMeta.url,
-    changefreq: CHANGE_FREQ,
-    priority: LIST_PRIORITY,
-    lastmod: new Date().toISOString()
-  };
-};
-
-export const createTagDetailSitemapItem = (
-  rawTag: string,
-  contexts: ArticleContext[],
-  options?: Partial<RoutesOptions>
-) => {
-  const routeMeta = createTagDetailRouteMeta(rawTag, contexts, options);
-
-  return {
-    url: routeMeta.url,
-    changefreq: CHANGE_FREQ,
-    priority: LIST_PRIORITY,
-    lastmod: new Date().toISOString()
-  };
-};
-
-export const createCategoryDetailSitemapItem = (
-  rawCategory: string,
-  contexts: ArticleContext[],
-  options?: Partial<RoutesOptions>
-) => {
-  const routeMeta = createCategoryDetailRouteMeta(rawCategory, contexts, options);
-
-  return {
-    url: routeMeta.url,
-    changefreq: CHANGE_FREQ,
-    priority: LIST_PRIORITY,
-    lastmod: new Date().toISOString()
-  };
-};
-export const createPostDetailSitemapItem = (
-  article: ArticleContext,
-  contexts: ArticleContext[],
-  options?: Partial<RoutesOptions>
-) => {
-  const routeMeta = createPostDetailRouteMeta(article, contexts, options);
-
-  return {
-    url: routeMeta.url,
-    changefreq: CHANGE_FREQ,
-    priority: DETAIL_PRIORITY,
-    lastmod: new Date(article.updated).toISOString(),
-    keywords:
article.tags.join(KEYWORDS_SEPARATOR),
-    img: _.map(article.images, (image) => ({
-      url: path.join(buildPostPathFromContext(article), image)
-    }))
-  };
-};
-
-export const createPageDetailSitemapItem = (article: ArticleContext, options?: Partial<RoutesOptions>) => {
-  const routeMeta = createPagesDetailRouteMeta(article, options);
-
-  return {
-    url: routeMeta.url,
-    changefreq: CHANGE_FREQ,
-    priority: DETAIL_PRIORITY,
-    lastmod: new Date(article.updated).toISOString(),
-    keywords: article.tags.join(KEYWORDS_SEPARATOR),
-    img: _.map(article.images, (image) => ({
-      url: path.join(buildPagePathFromContext(article), image)
-    }))
-  };
-};
-
-export const createSitemapContent = async (
-  postContexts: ArticleContext[],
-  pageContexts: ArticleContext[],
-  options?: Partial<RoutesOptions>
-) => {
-  const sitemap = new SitemapStream({ hostname: options.baseUrl });
-
-  const homeSitemapItem = createHomeSitemapItem(options);
-  const postsOverviewSitemapItem = createPostsOverviewSitemapItem(postContexts, options);
-  const categoriesOverviewSitemapItem = createCategoriesOverviewSitemapItem(postContexts, options);
-  const tagsOverviewSitemapItem = createTagsOverviewSitemapItem(postContexts, options);
-
-  const allCategories = getAllCategoriesFromContexts(postContexts);
-  const allTags = getAllTagsFromContexts(postContexts);
-
-  const categoryDetailSitemapItems = _.map(allCategories, (category) =>
-    createCategoryDetailSitemapItem(category, postContexts, options)
-  );
-  const tagDetailSitemapItems = _.map(allTags, (tag) => createTagDetailSitemapItem(tag, postContexts, options));
-  const postDetailSitemapItems = _.map(postContexts, (article) =>
-    createPostDetailSitemapItem(article, postContexts, options)
-  );
-  const pageDetailSitemapItems = _.map(pageContexts, (article) => createPageDetailSitemapItem(article, options));
-
-  const allSitemapItems = _.concat(
-    [homeSitemapItem],
-    [postsOverviewSitemapItem],
-    [categoriesOverviewSitemapItem],
-    [tagsOverviewSitemapItem],
-    categoryDetailSitemapItems,
-
tagDetailSitemapItems,
-    postDetailSitemapItems,
-    pageDetailSitemapItems
-  );
-
-  _.each(allSitemapItems, (sitemapItem) => {
-    sitemap.write(sitemapItem);
-  });
-
-  sitemap.end();
-
-  return await streamToPromise(sitemap);
-};
diff --git a/packages/router/src/tag.route.util.ts b/packages/router/src/tag.route.util.ts
deleted file mode 100644
index be832d5ea..000000000
--- a/packages/router/src/tag.route.util.ts
+++ /dev/null
@@ -1,37 +0,0 @@
-import * as _ from 'lodash';
-import * as uslug from 'uslug';
-import { Meta, MetaName, RoutePathPrefix } from '@blog/common/interfaces/routes';
-
-export const createTagsOverviewRouteItem = () => ({
-  id: RoutePathPrefix.TAGS,
-  label: `Tags` // TODO: add i18n support
-});
-
-export const createTagDetailRouteItem = (rawTag: string) => ({
-  id: uslug(rawTag),
-  label: `Tag: ${rawTag}`
-});
-
-export const createTagsOverviewDescMeta = (): Meta => ({
-  name: MetaName.DESCRIPTION,
-  itemprop: MetaName.DESCRIPTION,
-  content: `Tags`
-});
-
-export const createTagDetailDescMeta = (rawTag: string): Meta => ({
-  name: MetaName.DESCRIPTION,
-  itemprop: MetaName.DESCRIPTION,
-  content: `Tag: ${rawTag}`
-});
-
-export const createTagDetailOpenGraphMetas = (rawTag: string): Meta[] => [
-  {
-    name: MetaName.OPEN_GRAPH_DESCRIPTION,
-    itemprop: MetaName.DESCRIPTION,
-    content: `Tag: ${rawTag}`
-  }
-];
-
-export const createTagsOverviewMetas = (): Meta[] => [createTagsOverviewDescMeta()];
-export const createTagDetailMetas = (rawTag: string): Meta[] =>
-  _.concat(createTagDetailOpenGraphMetas(rawTag), [createTagDetailDescMeta(rawTag)]);
diff --git a/packages/router/tsconfig.json b/packages/router/tsconfig.json
deleted file mode 100644
index f1a0e1870..000000000
--- a/packages/router/tsconfig.json
+++ /dev/null
@@ -1,9 +0,0 @@
-{
-  "extends": "../tsconfig.lib.json",
-  "compilerOptions": {
-    "outDir": "./dist",
-    "baseUrl": "./"
-  },
-  "include": ["src"],
-  "exclude": ["node_modules", "dist", "reports", "src/__tests__"]
-}
diff --git
a/service/package.json b/service/package.json
deleted file mode 100644
index 781a54518..000000000
--- a/service/package.json
+++ /dev/null
@@ -1,92 +0,0 @@
-{
-  "name": "@blog/service",
-  "version": "6.26.198",
-  "description": "core build service",
-  "author": "Aquariuslt ",
-  "homepage": "https://github.com/aquariuslt/blog",
-  "license": "MIT",
-  "private": true,
-  "repository": {
-    "type": "git",
-    "url": "git+https://github.com/aquariuslt/blog.git"
-  },
-  "scripts": {
-    "clean": "rimraf dist",
-    "prebuild:service:prod": "yarn run build:service",
-    "build:service:prod": "cross-env NODE_ENV=production node -r tsconfig-paths/register -r ts-node/register dist/main",
-    "prebuild:service": "yarn clean",
-    "build:service": "nest build",
-    "start:service": "nest start",
-    "test": "jest",
-    "test:cov": "jest --coverage"
-  },
-  "dependencies": {
-    "@blog/api": "^6.26.198",
-    "@blog/article": "^6.26.198",
-    "@blog/common": "^6.26.198",
-    "@blog/config": "^6.26.198",
-    "@blog/markdown": "^6.26.198",
-    "@blog/pwa": "^6.26.198",
-    "@blog/router": "^6.26.198",
-    "@nestjs/common": "7.6.13",
-    "@nestjs/core": "7.6.13",
-    "@nestjs/platform-express": "7.6.13",
-    "@nestjs/serve-static": "2.1.4",
-    "class-transformer": "0.4.0",
-    "class-validator": "0.13.1",
-    "cosmiconfig": "7.0.0",
-    "date-fns": "2.19.0",
-    "fancy-log": "1.3.3",
-    "fs-extra": "9.1.0",
-    "get-port": "5.1.1",
-    "lodash": "4.17.21",
-    "log4js": "6.3.0",
-    "puppeteer": "5.5.0",
-    "reflect-metadata": "0.1.13",
-    "rxjs": "6.6.6"
-  },
-  "devDependencies": {
-    "@nestjs/cli": "7.5.6",
-    "@nestjs/schematics": "7.2.8",
-    "@nestjs/testing": "7.6.13",
-    "@types/express": "4.17.11",
-    "@types/fancy-log": "1.3.1",
-    "@types/fs-extra": "9.0.7",
-    "@types/jest": "26.0.20",
-    "@types/lodash": "4.14.168",
-    "@types/node": "13.13.45",
-    "@types/puppeteer": "5.4.3",
-    "@types/supertest": "2.0.10",
-    "cross-env": "7.0.3",
-    "jest": "26.6.3",
-    "supertest": "6.1.3",
-    "ts-jest": "26.5.3",
-    "ts-node": "9.1.1",
-    "tsc-watch": "4.2.9",
-
"tsconfig-paths": "3.9.0", - "typescript": "4.0.5" - }, - "jest": { - "moduleFileExtensions": [ - "ts", - "js", - "json" - ], - "moduleNameMapper": { - "^@/(.*)$": "/src/$1" - }, - "transform": { - "^.+\\.ts$": "ts-jest", - "^.*\\.md$": "jest-raw-loader" - }, - "collectCoverageFrom": [ - "!**/__tests__/**", - "/src/**/*.ts" - ], - "testMatch": [ - "/src/**/*.test.ts" - ], - "testEnvironment": "node", - "coverageDirectory": "/coverage" - } -} diff --git a/service/src/api/api.service.ts b/service/src/api/api.service.ts deleted file mode 100644 index 6a38de6fb..000000000 --- a/service/src/api/api.service.ts +++ /dev/null @@ -1,181 +0,0 @@ -import * as _ from 'lodash'; -import { Injectable, Logger, OnModuleInit } from '@nestjs/common'; -import { ConfigService } from '@/config/config.service'; -import { ArticleService } from '@/article/article.service'; -import { RoutesService } from '@/routes/routes.service'; -import { RoutePathPrefix } from '@blog/common/interfaces/routes'; -import { ApiData } from '@blog/common/interfaces/api'; -import { - createCategoriesOverviewApiData, - createCategoryDetailApiData, - createPostDetailApiData, - createPostsOverviewApiData, - createTagDetailApiData, - createTagsOverviewApiData, - createNavigationApiData, - persistApi, - createProfileApiData, - createPageDetailApiData, - createPageNavigationItem -} from '@blog/api'; -import { buildURLPath } from '@blog/common/utils/path.util'; - -@Injectable() -export class ApiService implements OnModuleInit { - private readonly logger = new Logger(ApiService.name); - - private $inited; - - public apiMap; - private apis: Partial[]; - - private home: ApiData; - private navigation: Partial; - private profile: Partial; - - private tagsOverview: ApiData; - private tagDetails: ApiData[]; - - private categoriesOverview: ApiData; - private categoryDetails: ApiData[]; - - private postsOverview: ApiData; - private postDetails: ApiData[]; - - private pageDetails: ApiData[]; - - constructor( - private readonly 
config: ConfigService,
-    private readonly article: ArticleService,
-    private readonly routes: RoutesService
-  ) {}
-
-  async onModuleInit() {
-    if (this.$inited) {
-      return;
-    }
-    this.routes.onModuleInit();
-    this.buildApi();
-    this.$inited = true;
-  }
-
-  buildApi() {
-    this.home = this.buildHomeApi();
-
-    this.tagsOverview = this.buildTagsOverviewApi();
-    this.categoriesOverview = this.buildCategoriesOverviewApi();
-    this.postsOverview = this.buildPostsOverviewApi();
-
-    this.tagDetails = this.buildTagDetailsApi();
-    this.categoryDetails = this.buildCategoryDetailsApi();
-    this.postDetails = this.buildPostDetailsApi();
-    this.pageDetails = this.buildPageDetailsApi();
-
-    this.navigation = this.buildNavigationApi();
-    this.profile = this.buildProfileApi();
-
-    this.apis = _.concat(
-      [this.home],
-      [this.tagsOverview],
-      [this.categoriesOverview],
-      [this.postsOverview],
-      [this.navigation],
-      [this.profile],
-      this.tagDetails,
-      this.categoryDetails,
-      this.postDetails,
-      this.pageDetails
-    );
-
-    this.apiMap = _.keyBy(this.apis, 'path');
-
-    _.each(this.apis, (api) => {
-      this.logger.log(`Persisting Api for path: ${api.path}`);
-      persistApi(api.path, api, this.config.dirs.api);
-    });
-  }
-
-  buildHomeApi() {
-    const HOME_POSTS_DISPLAY_LENGTH = 10;
-    return _.merge({}, this.routes.home, {
-      path: buildURLPath(RoutePathPrefix.HOME_ALIAS),
-      data: createPostsOverviewApiData(this.article.postContexts).filter(
-        (post, index) => index < HOME_POSTS_DISPLAY_LENGTH
-      )
-    });
-  }
-
-  buildTagsOverviewApi() {
-    return _.merge({}, this.routes.tagsOverview, {
-      data: createTagsOverviewApiData(this.article.postContexts)
-    });
-  }
-
-  buildTagDetailsApi() {
-    const tagDetails = this.routes.tagDetails;
-    return _.map(tagDetails, (tagDetail) => {
-      return _.merge({}, tagDetail, {
-        data: createTagDetailApiData(tagDetail.key, this.article.postContexts)
-      });
-    });
-  }
-
-  buildCategoriesOverviewApi() {
-    return _.merge({}, this.routes.categoriesOverview, {
-      data:
createCategoriesOverviewApiData(this.article.postContexts)
-    });
-  }
-
-  buildCategoryDetailsApi() {
-    const categoryDetails = this.routes.categoryDetails;
-    return _.map(categoryDetails, (categoryDetail) => {
-      return _.merge({}, categoryDetail, {
-        data: createCategoryDetailApiData(categoryDetail.key, this.article.postContexts)
-      });
-    });
-  }
-
-  buildPostsOverviewApi() {
-    return _.merge({}, this.routes.postsOverview, {
-      data: createPostsOverviewApiData(this.article.postContexts)
-    });
-  }
-
-  buildPostDetailsApi() {
-    return _.map(this.routes.postDetails, (postDetail) => {
-      const data = _.merge({}, createPostDetailApiData(postDetail.key, this.article.postContexts), {
-        disqus: {
-          shortname: this.config.site.disqus,
-          url: postDetail.url,
-          identifier: postDetail.path.replace(/\//g, '-')
-        }
-      });
-
-      return _.merge({}, postDetail, { data });
-    });
-  }
-
-  buildPageDetailsApi() {
-    return _.map(this.routes.pageDetails, (pageDetail) => {
-      const context = _.find(this.article.pageContexts, { id: pageDetail.key });
-      const data = _.merge({}, createPageDetailApiData(context));
-      return _.merge({}, pageDetail, { data });
-    });
-  }
-
-  buildNavigationApi() {
-    return {
-      path: buildURLPath(RoutePathPrefix.NAVIGATION),
-      data: createNavigationApiData().concat(
-        _.map(this.article.pageContexts, (context) => createPageNavigationItem(context))
-      )
-    };
-  }
-
-  buildProfileApi() {
-    return {
-      path: buildURLPath(RoutePathPrefix.PROFILE),
-      data: createProfileApiData(this.config.profile)
-    };
-  }
-}
diff --git a/service/src/app.module.ts b/service/src/app.module.ts
deleted file mode 100644
index 8c58f3edc..000000000
--- a/service/src/app.module.ts
+++ /dev/null
@@ -1,25 +0,0 @@
-import { Module } from '@nestjs/common';
-import { ConfigModule } from '@/config/config.module';
-import { ArticleModule } from '@/article/article.module';
-import { RoutesModule } from '@/routes/routes.module';
-import { ApiModule } from './api/api.module';
-import { ThemeModule } from
'./theme/theme.module';
-import { RenderModule } from './render/render.module';
-import { RenderServerModule } from '@/render/render-server.module';
-import { LoggerModule } from './logger/logger.module';
-
-@Module({
-  imports: [
-    ConfigModule,
-    ArticleModule,
-    RoutesModule,
-    ApiModule,
-    ThemeModule,
-    RenderModule,
-    RenderServerModule,
-    LoggerModule
-  ],
-  providers: [],
-  exports: []
-})
-export class AppModule {}
diff --git a/service/src/logger/custom.logger.ts b/service/src/logger/custom.logger.ts
deleted file mode 100644
index 8196624d6..000000000
--- a/service/src/logger/custom.logger.ts
+++ /dev/null
@@ -1,66 +0,0 @@
-import { addLayout, configure, getLogger, Logger } from 'log4js';
-import { Injectable, Optional, Logger as NestLogger } from '@nestjs/common';
-
-@Injectable()
-export class CustomLogger extends NestLogger {
-  private internalLogger: Logger;
-
-  constructor(@Optional() context?: string, @Optional() isTimestampEnabled = false) {
-    super(context, isTimestampEnabled);
-    this.configure();
-  }
-
-  private configure() {
-    addLayout('json', () => {
-      return (logEvent) =>
-        JSON.stringify({
-          timestamp: logEvent.startTime,
-          pid: logEvent.pid,
-          level: logEvent.level.levelStr,
-          message: logEvent.data[1],
-          context: logEvent.data[0] || 'No-Context'
-        });
-    });
-
-    configure({
-      appenders: {
-        stdout: {
-          type: 'stdout'
-        },
-        file: {
-          type: 'dateFile',
-          filename: 'logs/app.log', // TODO: Update to Configured Value: Logging
-          pattern: '.yyyy-MM-dd-hh',
-          compress: true,
-          layout: {
-            type: 'json'
-          }
-        }
-      },
-      categories: {
-        default: {
-          appenders: ['stdout', 'file'],
-          level: 'trace'
-        }
-      }
-    });
-
-    this.internalLogger = getLogger('App');
-  }
-
-  debug(message: any, context?: string) {
-    this.internalLogger.debug(context, message);
-  }
-
-  log(message: string, context?: string) {
-    this.internalLogger.info(context, message);
-  }
-
-  warn(message: any, context?: string) {
-    this.internalLogger.warn(context, message);
-  }
-
-
error(message: any, trace?: string, context?: string) {
-    this.internalLogger.error(context, message, trace);
-  }
-}
diff --git a/service/src/main.ts b/service/src/main.ts
deleted file mode 100644
index 59f1eab65..000000000
--- a/service/src/main.ts
+++ /dev/null
@@ -1,13 +0,0 @@
-import { NestFactory } from '@nestjs/core';
-import { AppModule } from '@/app.module';
-import { CustomLogger } from '@/logger/custom.logger';
-
-async function bootstrap() {
-  const port = 2999;
-  const app = await NestFactory.create(AppModule, { logger: false });
-  app.useLogger(app.get(CustomLogger));
-  await app.listen(port);
-  await app.close();
-}
-
-bootstrap().then();
diff --git a/service/src/theme/theme.service.ts b/service/src/theme/theme.service.ts
deleted file mode 100644
index fb234529d..000000000
--- a/service/src/theme/theme.service.ts
+++ /dev/null
@@ -1,125 +0,0 @@
-import * as _ from 'lodash';
-import * as path from 'path';
-import * as fse from 'fs-extra';
-import { Injectable, Logger, OnModuleInit } from '@nestjs/common';
-import { ConfigService } from '@/config/config.service';
-import { ArticleService } from '@/article/article.service';
-import { RoutesService } from '@/routes/routes.service';
-import { ApiService } from '@/api/api.service';
-import { RenderService } from '@/render/render.service';
-import { createSitemapContent } from '@blog/router';
-import { persistFile } from '@blog/api';
-import { buildServiceWorker } from '@blog/pwa';
-
-@Injectable()
-export class ThemeService implements OnModuleInit {
-  private readonly logger = new Logger(ThemeService.name);
-
-  constructor(
-    private readonly config: ConfigService,
-    private readonly article: ArticleService,
-    private readonly routes: RoutesService,
-    private readonly api: ApiService,
-    private readonly renderer: RenderService
-  ) {}
-
-  async onModuleInit() {
-    await this.routes.onModuleInit();
-    await this.api.onModuleInit();
-    await this.renderer.onModuleInit();
-    this.logger.log(`Detecting Theme:
${this.config.theme}`);
-    this.buildCNAME();
-    this.buildNoJekyll();
-    this.buildSitemap();
-    if (process.env.NODE_ENV === 'production') {
-      this.buildThemeAssets();
-      this.buildFallbackHtml();
-      await this.prerender();
-      await this.buildPWAAssets();
-      await this.buildNowConfig();
-    }
-  }
-
-  private buildCNAME() {
-    const CNAME = this.config.site.baseUrl;
-    this.logger.log(`Persisting CNAME: ${CNAME}`);
-    persistFile(`CNAME`, CNAME, this.config.dirs.dest);
-  }
-
-  private buildNoJekyll() {
-    this.logger.log(`Persisting .nojekyll`);
-    persistFile(`.nojekyll`, null, this.config.dirs.dest);
-  }
-
-  private buildThemeAssets() {
-    const themeDestDir = path.join(this.config.theme, `dist`);
-    const targetDir = this.config.dirs.dest;
-
-    this.logger.log(`Copying theme assets from ${themeDestDir} to ${targetDir}`);
-    fse.copySync(themeDestDir, targetDir);
-  }
-
-  private buildFallbackHtml() {
-    this.logger.log(`Copy index.html as fallback 404.html`);
-    const indexHtml = path.join(this.config.dirs.dest, `index.html`);
-    const fallbackHtml = path.join(this.config.dirs.dest, `404.html`);
-    fse.copySync(indexHtml, fallbackHtml);
-  }
-
-  private async buildSitemap() {
-    const sitemap = await createSitemapContent(
-      this.article.postContexts,
-      this.article.pageContexts,
-      this.routes.routesOptions
-    );
-    this.logger.log(`Persisting Sitemap`);
-    persistFile(`sitemap.xml`, sitemap.toString(), this.config.dirs.dest);
-  }
-
-  private async buildPWAAssets() {
-    this.logger.log(`Build Service Worker related files to ${this.config.dirs.dest}`);
-    await buildServiceWorker(this.config.dirs.dest);
-    this.logger.log(`Build Service Worker complete;`);
-  }
-
-  private async buildNowConfig() {
-    this.logger.log(`Build now.json for static routing`);
-    const nowConfiguration = {
-      version: 2,
-      trailingSlash: true
-    };
-    fse.writeFileSync(path.join(this.config.dirs.dest, `/now.json`), JSON.stringify(nowConfiguration));
-    this.logger.log(`Build now.json complete`);
-  }
-
-  private async
prerender() {
-    const prerenderTasks = this.routes.routes
-      .filter((route) => route.path !== '/')
-      .map((route) => {
-        return new Promise((resolve) => {
-          this.captureAndSaveRoute(route.path)
-            .then(() => {
-              resolve();
-            })
-            .catch((error) => {
-              this.logger.error(error.message);
-            });
-        });
-      });
-
-    await Promise.all(prerenderTasks);
-
-    await this.captureAndSaveRoute('/');
-  }
-
-  private async captureAndSaveRoute(p: string) {
-    const htmlPath = path.join(this.config.dirs.dest, p, `/index.html`);
-    if (p !== '/') {
-      fse.removeSync(htmlPath);
-    }
-    fse.ensureDirSync(path.dirname(htmlPath));
-    const html = await this.renderer.render(p);
-    this.logger.log(`Pre-rendering html content to ${htmlPath}`);
-    fse.writeFileSync(htmlPath, html);
-  }
-}
diff --git a/themes/theme-react/package.json b/themes/theme-react/package.json
deleted file mode 100644
index 4c9fcafae..000000000
--- a/themes/theme-react/package.json
+++ /dev/null
@@ -1,122 +0,0 @@
-{
-  "name": "@blog/theme-react",
-  "version": "6.26.198",
-  "private": true,
-  "license": "MIT",
-  "scripts": {
-    "test": "jest",
-    "prebuild:theme": "gulp clean",
-    "build:theme": "gulp build",
-    "serve": "gulp serve"
-  },
-  "dependencies": {
-    "@blog/common": "^6.26.198",
-    "@blog/config": "^6.26.198",
-    "@loadable/component": "5.14.1",
-    "@material-ui/core": "4.11.3",
-    "@material-ui/icons": "4.11.2",
-    "axios": "0.21.1",
-    "classnames": "2.2.6",
-    "clsx": "1.1.1",
-    "date-fns": "2.19.0",
-    "github-markdown-css": "4.0.0",
-    "highlight.js": "10.6.0",
-    "notistack": "1.0.5",
-    "react": "17.0.1",
-    "react-disqus-components": "1.2.3",
-    "react-dom": "17.0.1",
-    "react-helmet": "6.1.0",
-    "react-router-dom": "5.2.0",
-    "register-service-worker": "1.7.2",
-    "scroll-into-view-if-needed": "2.2.27",
-    "typeface-roboto": "1.1.13",
-    "vanilla-lazyload": "17.3.1"
-  },
-  "devDependencies": {
-    "@testing-library/react": "11.2.5",
-    "@types/gulp": "4.0.8",
-    "@types/jest": "26.0.20",
-    "@types/node": "13.13.45",
-    "@types/react":
"17.0.3", - "@types/react-dom": "17.0.1", - "@types/react-router-dom": "5.1.7", - "@types/webpack": "4.41.26", - "autoprefixer": "10.2.5", - "clean-webpack-plugin": "3.0.0", - "css-loader": "5.1.1", - "disqus-react": "1.0.11", - "enzyme": "3.11.0", - "fancy-log": "1.3.3", - "favicons-webpack-plugin": "4.2.0", - "file-loader": "6.2.0", - "friendly-errors-webpack-plugin": "1.7.0", - "gulp": "4.0.2", - "gulp-rimraf": "1.0.0", - "html-webpack-plugin": "4.5.2", - "jest": "26.6.3", - "jest-environment-enzyme": "7.1.2", - "jest-enzyme": "7.1.2", - "jest-properties-loader": "1.0.8", - "jest-raw-loader": "1.0.1", - "jest-transform-stub": "2.0.0", - "lodash": "4.17.21", - "mini-css-extract-plugin": "1.3.9", - "mkdirp": "1.0.4", - "optimize-css-assets-webpack-plugin": "5.0.4", - "postcss": "8.2.7", - "postcss-load-config": "3.0.1", - "postcss-loader": "5.1.0", - "postcss-pxtorem": "6.0.0", - "preload-webpack-plugin": "3.0.0-beta.4", - "properties-json-loader": "2.2.2", - "robotstxt-webpack-plugin": "7.0.0", - "script-ext-html-webpack-plugin": "2.1.5", - "style-loader": "2.0.0", - "terser-webpack-plugin": "4.2.3", - "ts-jest": "26.5.3", - "ts-loader": "8.0.17", - "ts-node": "9.1.1", - "tsconfig-paths-webpack-plugin": "3.3.0", - "typescript": "4.0.5", - "uglifyjs-webpack-plugin": "2.2.0", - "url-loader": "4.1.1", - "webpack": "4.46.0", - "webpack-bundle-analyzer": "4.4.0", - "webpack-dev-middleware": "4.1.0", - "webpack-dev-server": "3.11.2", - "webpack-hot-middleware": "2.25.0", - "webpack-merge": "4.2.2" - }, - "postcss": { - "plugins": { - "autoprefixer": {} - } - }, - "jest": { - "moduleFileExtensions": [ - "js", - "jsx", - "json", - "ts", - "tsx" - ], - "moduleNameMapper": { - "^@theme-react/(.*)$": "/src/$1" - }, - "transform": { - "^.+\\.tsx?$": "ts-jest", - ".+\\.(css|styl|less|sass|scss|svg|png|jpg|ttf|woff|woff2)$": "jest-transform-stub" - }, - "transformIgnorePatterns": [ - "/node_modules/" - ], - "collectCoverageFrom": [ - "!**/__tests__/**", - "/src/**/*.ts" - ], - 
"testMatch": [ - "/src/**/*.test.ts" - ], - "coverageDirectory": "/coverage" - } -} diff --git a/themes/theme-react/src/App.tsx b/themes/theme-react/src/App.tsx deleted file mode 100644 index 426f1bdfc..000000000 --- a/themes/theme-react/src/App.tsx +++ /dev/null @@ -1,47 +0,0 @@ -import * as React from 'react'; -import { useEffect, useState } from 'react'; -import { BrowserRouter as Router } from 'react-router-dom'; -import { ThemeProvider } from '@material-ui/core/styles'; -import { RouterView } from '@theme-react/router'; -import { Navigation } from '@theme-react/components/Navigation'; -import { EmptyProfile } from '@blog/common/interfaces/profile'; -import { RoutePathPrefix } from '@blog/common/interfaces/routes'; -import { SnackbarProvider } from 'notistack'; -import { buildURLPath } from '@blog/common/utils/path.util'; -import { loadApi } from '@theme-react/api'; -import { theme } from '@theme-react/constants'; -import { ServiceWorkerNotification } from '@theme-react/components/ServiceWorkerNotification'; - -export const App: React.FC = () => { - const [navigationItems, setNavigationItems] = useState([]); - const [profile, setProfile] = useState(EmptyProfile); - const [title] = React.useState(process.env.BASE_TITLE || ''); - - const loadNavigation = async () => { - const navigation = await loadApi(buildURLPath(RoutePathPrefix.NAVIGATION)); - setNavigationItems(navigation.data); - }; - - const loadProfile = async () => { - const profile = await loadApi(buildURLPath(RoutePathPrefix.PROFILE)); - setProfile(profile.data); - }; - - useEffect(() => { - loadNavigation(); - loadProfile(); - }, []); - - return ( - - - - - - - - - - - ); -}; diff --git a/themes/theme-react/src/components/ArticleCard.tsx b/themes/theme-react/src/components/ArticleCard.tsx deleted file mode 100644 index d194aeb72..000000000 --- a/themes/theme-react/src/components/ArticleCard.tsx +++ /dev/null @@ -1,53 +0,0 @@ -import * as React from 'react'; -import { format, parseISO } from 'date-fns'; 
-import { createStyles, makeStyles, Theme } from '@material-ui/core/styles'; -import { Link as RouterLink } from 'react-router-dom'; -import { CARD_MAX_WIDTH } from '@theme-react/constants'; -import { ArticleContext } from '@blog/common/interfaces/articles/article-context'; -import { LazyImage } from '@theme-react/components/LazyImage'; -import Card from '@material-ui/core/Card'; -import CardActionArea from '@material-ui/core/CardActionArea'; -import CardContent from '@material-ui/core/CardContent'; -import Typography from '@material-ui/core/Typography'; - -const useStyles = makeStyles((theme: Theme) => - createStyles({ - card: { - margin: theme.spacing(1), - maxWidth: CARD_MAX_WIDTH, - [theme.breakpoints.down('sm')]: { - margin: theme.spacing(1, 0, 0, 0) - } - }, - media: { - width: '100%', - maxWidth: '100%' - }, - date: { - marginBottom: theme.spacing(1) - } - }) -); - -export const ArticleCard: React.FC> = (props) => { - const classes = useStyles(); - - return ( - - - - - - {props.title} - - - {format(parseISO(props.created || ''), 'yyyy-MM-dd')} - - - {props.summary} - - - - - ); -}; diff --git a/themes/theme-react/src/components/ArticleDetail.tsx b/themes/theme-react/src/components/ArticleDetail.tsx deleted file mode 100644 index c90290f0d..000000000 --- a/themes/theme-react/src/components/ArticleDetail.tsx +++ /dev/null @@ -1,54 +0,0 @@ -import '@theme-react/markdown.css'; -import * as React from 'react'; -import { ArticleContext } from '@blog/common/interfaces/articles/article-context'; -import { createStyles, makeStyles, Theme } from '@material-ui/core/styles'; -import { CARD_MAX_WIDTH } from '@theme-react/constants'; -import { Comment } from 'react-disqus-components'; -import { KeywordChip } from '@theme-react/components/KeywordChip'; -import { ViewsShow } from '@theme-react/components/ViewsShow'; -import { LazyImage } from '@theme-react/components/LazyImage'; -import Typography from '@material-ui/core/Typography'; - -const useStyles = makeStyles((theme: 
Theme) => - createStyles({ - root: { - margin: theme.spacing(0), - maxWidth: CARD_MAX_WIDTH, - width: '100%', - padding: theme.spacing(0, 2), - [theme.breakpoints.down('sm')]: { - padding: theme.spacing(1) - } - }, - cover: { - width: '100%', - maxWidth: '100%' - }, - divider: { - marginTop: theme.spacing(2) - } - }) -); - -export const ArticleDetail: React.FC> = (props) => { - const classes = useStyles(); - - return ( -
- - - -
- - {props.id && } - {props.tags && (props.tags as any).map((keyword) => )} - -
- ); -}; diff --git a/themes/theme-react/src/components/BreadcrumbList.tsx b/themes/theme-react/src/components/BreadcrumbList.tsx deleted file mode 100644 index bd949bcf4..000000000 --- a/themes/theme-react/src/components/BreadcrumbList.tsx +++ /dev/null @@ -1,65 +0,0 @@ -import * as React from 'react'; -import { Link as RouterLink } from 'react-router-dom'; -import { createStyles, makeStyles, Theme } from '@material-ui/core/styles'; -import { BreadcrumbList as BreadcrumbListProps } from '@blog/common/interfaces/routes/breadcrumb'; -import { CARD_MAX_WIDTH } from '@theme-react/constants'; -import NavigateNextIcon from '@material-ui/icons/NavigateNext'; -import Paper from '@material-ui/core/Paper'; -import Link from '@material-ui/core/Link'; -import Breadcrumbs from '@material-ui/core/Breadcrumbs'; - -const useStyles = makeStyles((theme: Theme) => - createStyles({ - root: { - justifyContent: 'center', - flexWrap: 'wrap', - width: '100%', - maxWidth: CARD_MAX_WIDTH - }, - paper: { - backgroundColor: theme.palette.background.paper, - padding: theme.spacing(1, 2), - [theme.breakpoints.down('sm')]: { - padding: theme.spacing(1) - } - }, - breadcrumbs: { - '& > ol': { - flexWrap: 'nowrap' - }, - '& > ol > li:last-child': { - overflowY: 'hidden', - textOverflow: 'ellipsis', - whiteSpace: 'nowrap' - } - } - }) -); - -export const BreadcrumbList: React.FC = (props) => { - const classes = useStyles(); - const breadcrumbItems = props.itemListElement || []; - - return ( -
- - } - aria-label="breadcrumb" - className={classes.breadcrumbs} - > - {breadcrumbItems.map((item, index) => ( - - {item.name} - - ))} - - -
- ); -}; diff --git a/themes/theme-react/src/components/ContentItems.tsx b/themes/theme-react/src/components/ContentItems.tsx deleted file mode 100644 index cf4d902d9..000000000 --- a/themes/theme-react/src/components/ContentItems.tsx +++ /dev/null @@ -1,201 +0,0 @@ -import * as React from 'react'; -import clsx from 'clsx'; -import throttle from 'lodash/throttle'; -import noop from 'lodash/noop'; -import scrollIntoView from 'scroll-into-view-if-needed'; -import { createStyles, makeStyles, Theme, useTheme } from '@material-ui/core/styles'; -import { ContentItem } from '@blog/common/interfaces/articles/content-item'; -import { DRAWER_WIDTH } from '@theme-react/constants'; -import Typography from '@material-ui/core/Typography'; - -const useStyles = makeStyles((theme: Theme) => - createStyles({ - root: { - width: DRAWER_WIDTH, - flexShrink: 0, - order: 2, - position: 'fixed', - overflowX: 'hidden', - overflowY: 'auto', - overflowWrap: 'break-word', - padding: theme.spacing(1), - display: 'none', - [theme.breakpoints.up('md')]: { - display: 'block' - }, - listStyleType: 'none' - }, - contents: { - marginTop: theme.spacing(2), - paddingLeft: theme.spacing(1.5) - }, - title: { - fontSize: 14, - padding: theme.spacing(1, 0, 0.5, 1), - boxSizing: 'content-box' - }, - ol: { - padding: 0, - margin: 0, - listStyleType: 'none' - }, - item: { - fontSize: 13, - padding: theme.spacing(0.5, 0, 0.5, 1), - borderLeft: '4px solid transparent', - boxSizing: 'content-box', - '&:hover': { - borderLeft: `4px solid ${theme.palette.grey[200]}`, - cursor: 'pointer' - }, - '&$active,&:active': { - borderLeft: `4px solid ${theme.palette.grey[400]}` - } - }, - active: {} - }) -); - -export interface ContentItemsProps { - items: ContentItem[]; -} - -const useThrottledOnScroll = (callback, delay) => { - const throttledCallback = React.useMemo(() => (callback ? 
throttle(callback, delay) : noop), [callback, delay]); - - React.useEffect(() => { - if (throttledCallback === noop) { - return undefined; - } - - window.addEventListener('scroll', throttledCallback); - return () => { - window.removeEventListener('scroll', throttledCallback); - throttledCallback.cancel(); - }; - }, [throttledCallback]); -}; - -const setHash = (id) => { - if (!history.pushState) { - return; - } - - history.pushState( - { - anchor: id - }, - document.title, - `#${id}` - ); -}; - -export const ContentItems: React.FC = (props) => { - const EMPTY_TIMEOUT_ID = -1; - const theme = useTheme(); - const classes = useStyles(); - const [activeState, setActiveState] = React.useState(null); - const [afterClick, setAfterClick] = React.useState(false); - const [lastClickTimeout, setLastClickTimeout] = React.useState(EMPTY_TIMEOUT_ID); - - const scrollTo = (id: string) => () => { - const SCROLL_DURATION = 4000; - const SCROLL_ANIMATION_DURATION = 100; - - setAfterClick(true); - - if (lastClickTimeout !== EMPTY_TIMEOUT_ID) { - clearTimeout(lastClickTimeout); - } - - if (activeState !== id) { - setActiveState(id); - } - - scrollIntoView(document.getElementById(id) as Element, { - behavior: 'smooth' - }); - setHash(id); - - const lastTimeoutId = setTimeout(() => { - setAfterClick(false); - setLastClickTimeout(EMPTY_TIMEOUT_ID); - }, SCROLL_DURATION + SCROLL_ANIMATION_DURATION); - - setLastClickTimeout(lastTimeoutId); - }; - - const collectAllIds = (rootItem: ContentItem) => { - const collectItemIds = (item) => { - if (item && item.children) { - let ids = [item.id]; - item.children.forEach((child) => { - ids = ids.concat(collectItemIds(child)); - }); - return ids; - } else { - return [item.id]; - } - }; - - return collectItemIds(rootItem); - }; - - const findActiveIndex = React.useCallback(() => { - if (props.items.length <= 0) { - return; - } - - const ids = collectAllIds(props.items[0]); - - let activeNode; - for (let i = ids.length - 1; i >= 0; i--) { - // No hash 
if we're near the top of the page - if (document.documentElement.scrollTop < 200) { - break; - } - - const checkingNode = document.getElementById(ids[i]); - if ( - checkingNode && - checkingNode.offsetTop < document.documentElement.scrollTop + document.documentElement.clientHeight / 8 - ) { - activeNode = checkingNode; - break; - } - } - - if (activeNode && activeState !== activeNode.id) { - setActiveState(activeNode.id); - } - }, [props.items]); - - useThrottledOnScroll(!afterClick ? findActiveIndex : null, 166); - - const ContentLink: React.FC = (item) => { - const isTitle = item.position === 0; - - return ( - - {isTitle ? 'Contents' : item.label} - - ); - }; - - return ( - - ); -}; diff --git a/themes/theme-react/src/components/LazyImage.tsx b/themes/theme-react/src/components/LazyImage.tsx deleted file mode 100644 index b04cd9f71..000000000 --- a/themes/theme-react/src/components/LazyImage.tsx +++ /dev/null @@ -1,41 +0,0 @@ -import * as React from 'react'; -import clsx from 'clsx'; -import LazyLoad from 'vanilla-lazyload'; -import placeholder from '@theme-react/imgs/placeholder.png'; -import { useEffect } from 'react'; - -export interface LazyImageProps { - image?: string; - alt?: string; - lazy?: boolean; - [key: string]: any; -} - -const PNG_EXTENSION = '.png'; -const WEBP_EXTENSION = '.webp'; - -export const LazyImage: React.FC = (props) => { - useEffect(() => { - const lazyLoadInstance = new LazyLoad({ - elements_selector: '.lazy' - }); - - lazyLoadInstance.update(); - }, [props.image]); - - const webpImage = props.image ? 
props.image.replace(PNG_EXTENSION, WEBP_EXTENSION) : ''; - - const shouldLazy = props.lazy !== false; - - return ( - - - {props.alt} - - ); -}; diff --git a/themes/theme-react/src/components/Navigation.tsx b/themes/theme-react/src/components/Navigation.tsx deleted file mode 100644 index 9f552b6a8..000000000 --- a/themes/theme-react/src/components/Navigation.tsx +++ /dev/null @@ -1,204 +0,0 @@ -import * as React from 'react'; -import clsx from 'clsx'; -import useScrollTrigger from '@material-ui/core/useScrollTrigger'; -import { Link as RouterLink } from 'react-router-dom'; -import { Profile } from '@blog/common/interfaces/profile'; -import { NavigationItem } from '@blog/common/interfaces/navigation'; -import { Icon } from '@theme-react/components/Icon'; -import { createStyles, makeStyles, Theme } from '@material-ui/core/styles'; -import CssBaseline from '@material-ui/core/CssBaseline'; -import AppBar from '@material-ui/core/AppBar'; -import Toolbar from '@material-ui/core/Toolbar'; -import IconButton from '@material-ui/core/IconButton'; -import Typography from '@material-ui/core/Typography'; -import Drawer from '@material-ui/core/Drawer'; -import Divider from '@material-ui/core/Divider'; -import List from '@material-ui/core/List'; -import ListItem from '@material-ui/core/ListItem'; -import ListItemText from '@material-ui/core/ListItemText'; -import ListItemIcon from '@material-ui/core/ListItemIcon'; -import Zoom from '@material-ui/core/Zoom'; -import Fab from '@material-ui/core/Fab'; -import MenuIcon from '@material-ui/icons/Menu'; -import ChevronLeftIcon from '@material-ui/icons/ChevronLeft'; - -export interface NavigationProps { - title: string; - profile: Profile; - menus: NavigationItem[]; -} - -const drawerWidth = 240; -const useStyles = makeStyles((theme: Theme) => - createStyles({ - root: { - display: 'flex' - }, - appBar: { - transition: theme.transitions.create(['margin', 'width'], { - easing: theme.transitions.easing.sharp, - duration: 
theme.transitions.duration.leavingScreen - }) - }, - appBarShift: { - width: `calc(100% - ${drawerWidth}px)`, - marginLeft: drawerWidth, - transition: theme.transitions.create(['margin', 'width'], { - easing: theme.transitions.easing.easeOut, - duration: theme.transitions.duration.enteringScreen - }) - }, - menuButton: { - marginRight: theme.spacing(2) - }, - hide: { - display: 'none' - }, - drawer: { - width: drawerWidth, - flexShrink: 0 - }, - drawerPaper: { - width: drawerWidth - }, - drawerHeader: { - display: 'flex', - alignItems: 'center', - padding: theme.spacing(0, 1), - ...theme.mixins.toolbar, - justifyContent: 'flex-end' - }, - content: { - flexGrow: 1, - transition: theme.transitions.create('margin', { - easing: theme.transitions.easing.sharp, - duration: theme.transitions.duration.leavingScreen - }), - maxWidth: '100%', - backgroundColor: theme.palette.background.paper, - marginLeft: -drawerWidth, - minHeight: '100vh' - }, - contentShift: { - transition: theme.transitions.create('margin', { - easing: theme.transitions.easing.easeOut, - duration: theme.transitions.duration.enteringScreen - }), - maxWidth: '100%', - backgroundColor: theme.palette.background.paper, - marginLeft: 0, - minHeight: '100vh' - }, - scrollToTop: { - position: 'fixed', - bottom: theme.spacing(2), - right: theme.spacing(2) - } - }) -); - -const ScrollTop: React.FC = (props) => { - const { children } = props; - const classes = useStyles(); - const trigger = useScrollTrigger({ - disableHysteresis: true, - threshold: 100 - }); - - const handleClick = (event: React.MouseEvent) => { - const anchor = ((event.target as HTMLDivElement).ownerDocument || document).querySelector('#back-to-top-anchor'); - - if (anchor) { - anchor.scrollIntoView({ behavior: 'smooth', block: 'center' }); - } - }; - - return ( - -
- {children} -
-
- ); -}; - -export const Navigation: React.FC = (props) => { - const classes = useStyles(); - const [open, setOpen] = React.useState(false); - - const handleDrawerOpen = () => { - setOpen(true); - }; - - const handleDrawerClose = () => { - setOpen(false); - }; - - return ( -
- - - - - - - - {props.title} - - - - -
- - - -
- - - - {props.menus.map((menu) => ( - - - - - - - ))} - -
- -
- - {props.children} -
- - - - - -
- ); -}; diff --git a/themes/theme-react/src/components/ViewsShow.tsx b/themes/theme-react/src/components/ViewsShow.tsx deleted file mode 100644 index 3a7beb179..000000000 --- a/themes/theme-react/src/components/ViewsShow.tsx +++ /dev/null @@ -1,28 +0,0 @@ -import React from 'react'; -import { createStyles, makeStyles, Theme } from '@material-ui/core/styles'; - -export interface ViewsShowProps { - vkey: string; - position?: string; -} - -const VIEWS_SHOW_API_PREFIX = `https://views.show/svg?key=`; -const VIEWS_SHOW_DEFAULT_POSITION = `x=0%25&y=10%25`; - -const useStyles = makeStyles((theme: Theme) => - createStyles({ - root: { - margin: theme.spacing(1, 0.5), - height: 30 - } - }) -); - -export const ViewsShow: React.FC = (props) => { - const classes = useStyles(); - return ( -
- views show -
- ); -}; diff --git a/themes/theme-react/src/constants.ts b/themes/theme-react/src/constants.ts deleted file mode 100644 index 512967e41..000000000 --- a/themes/theme-react/src/constants.ts +++ /dev/null @@ -1,43 +0,0 @@ -/** - * constants for theme styling - **/ -import createMuiTheme from '@material-ui/core/styles/createMuiTheme'; - -export const THEME_FONT_FAMILY = - 'Roboto, XHei, -apple-system, BlinkMacSystemFont, PingFang SC, Hiragino Sans GB, Microsoft YaHei,\n' + - ' WenQuanYi Micro Hei, Segoe UI, Helvetica, Arial, sans-serif, Apple Color Emoji, Segoe UI Emoji, Segoe UI Symbol'; -export const DRAWER_WIDTH = 240; -export const CARD_MAX_WIDTH = 800; - -export const TYPE_JSON_LD = 'application/ld+json'; - -export const theme = createMuiTheme({ - palette: { - primary: { - light: '#FFFFFF', - main: '#FFFFFF', - dark: '#C2C2C2', - contrastText: '#000000' - }, - background: { - paper: '#FFFFFF', - default: '#F5F5F5' - } - }, - overrides: { - MuiTypography: { - root: { - fontFamily: THEME_FONT_FAMILY - }, - body1: { - fontFamily: THEME_FONT_FAMILY - }, - body2: { - fontFamily: THEME_FONT_FAMILY - }, - h5: { - fontFamily: THEME_FONT_FAMILY - } - } - } -}); diff --git a/themes/theme-react/src/favicon.png b/themes/theme-react/src/favicon.png index 18c728d9d..d356fd965 100644 Binary files a/themes/theme-react/src/favicon.png and b/themes/theme-react/src/favicon.png differ diff --git a/themes/theme-react/src/index.html b/themes/theme-react/src/index.html deleted file mode 100644 index 0b8cbb176..000000000 --- a/themes/theme-react/src/index.html +++ /dev/null @@ -1,35 +0,0 @@ - - - - - - - - - - - - - - -
- - diff --git a/themes/theme-react/src/main.tsx b/themes/theme-react/src/main.tsx deleted file mode 100644 index 4a72bcf51..000000000 --- a/themes/theme-react/src/main.tsx +++ /dev/null @@ -1,11 +0,0 @@ -import './index.css'; -import './offline'; -import * as React from 'react'; -import * as ReactDOM from 'react-dom'; -import { App } from '@theme-react/App'; - -if (process.env.NODE_ENV === 'production') { - ReactDOM.hydrate(, document.getElementById('app')); -} else { - ReactDOM.render(, document.getElementById('app')); -} diff --git a/themes/theme-react/src/markdown.css b/themes/theme-react/src/markdown.css deleted file mode 100644 index 08fd44540..000000000 --- a/themes/theme-react/src/markdown.css +++ /dev/null @@ -1,13 +0,0 @@ -@import '~github-markdown-css'; -@import '~highlight.js/styles/a11y-dark.css'; - -/* manual set black background*/ -.markdown-body .highlight pre, -.markdown-body pre { - background-color: #2b2b2b; -} - -.markdown-body > * { - font-family: Roboto, XHei, -apple-system, BlinkMacSystemFont, PingFang SC, Hiragino Sans GB, Microsoft YaHei, - WenQuanYi Micro Hei, Segoe UI, Helvetica, Arial, sans-serif, Apple Color Emoji, Segoe UI Emoji, Segoe UI Symbol; -} diff --git a/themes/theme-react/src/offline.ts b/themes/theme-react/src/offline.ts deleted file mode 100644 index 9c0edddc9..000000000 --- a/themes/theme-react/src/offline.ts +++ /dev/null @@ -1,13 +0,0 @@ -if (process.env.NODE_ENV === 'production') { - const analyticsTracking = document.getElementById('google-analytics'); - if (analyticsTracking) { - window['ga'] = - window['ga'] || - function () { - (window['ga'].q = window['ga'].q || []).push(arguments); - }; - window['ga'].l = +new Date(); - window['ga']('create', analyticsTracking.getAttribute('content'), 'auto'); - window['ga']('send', 'pageview'); - } -} diff --git a/themes/theme-react/src/router.tsx b/themes/theme-react/src/router.tsx deleted file mode 100644 index fd58fb954..000000000 --- a/themes/theme-react/src/router.tsx +++ 
/dev/null @@ -1,104 +0,0 @@ -import loadable from '@loadable/component'; -import * as React from 'react'; -import { useEffect } from 'react'; -import { Route, Switch, useLocation } from 'react-router-dom'; -import { buildURLPath } from '@blog/common/utils/path.util'; -import { RoutePathPrefix } from '@blog/common/interfaces/routes'; -import List from '@theme-react/views/List'; -import Table from '@theme-react/views/Table'; - -// I must use Angular Style Routing for stupid react router philosophy -export const routes = [ - { - path: '/', - exact: true, - component: List, - apiPath: () => buildURLPath(RoutePathPrefix.HOME_ALIAS) - }, - { - path: '/posts/:year/:month/:date/:id', - exact: true, - component: loadable(() => - import( - /* webpackChunkName: "detail" */ - './views/Detail' - ) - ), - apiPath: (match) => - buildURLPath( - RoutePathPrefix.POSTS, - match.params['year'], - match.params['month'], - match.params['date'], - match.params['id'] - ) - }, - { - path: '/pages/:id', - exact: true, - component: loadable(() => - import( - /* webpackChunkName: "detail" */ - './views/Detail' - ) - ), - apiPath: (match) => buildURLPath(RoutePathPrefix.PAGES, match.params['id']) - }, - { - path: '/posts', - exact: true, - component: List, - apiPath: () => buildURLPath(RoutePathPrefix.POSTS) - }, - { - path: '/categories/:category', - exact: true, - component: List, - apiPath: (match) => buildURLPath(RoutePathPrefix.CATEGORIES, match.params['category']) - }, - { - path: '/tags/:tag', - exact: true, - component: List, - apiPath: (match) => buildURLPath(RoutePathPrefix.TAGS, match.params['tag']) - }, - { - path: '/categories/', - exact: true, - component: Table, - apiPath: () => buildURLPath(RoutePathPrefix.CATEGORIES) - }, - { - path: '/tags', - exact: true, - component: Table, - apiPath: () => buildURLPath(RoutePathPrefix.TAGS) - } -]; - -const usePageViews = () => { - const location = useLocation(); - - useEffect(() => { - if (window['ga']) { - window['ga']('set', 'page', 
location.pathname); - window['ga']('send', 'pageview', location.pathname); - } - }, [location]); -}; - -export const RouterView: React.FC = () => { - usePageViews(); - return ( - - {routes.map((route, i) => ( - } - /> - ))} - - ); -}; diff --git a/themes/theme-react/src/views/Detail.tsx b/themes/theme-react/src/views/Detail.tsx deleted file mode 100644 index cc88013e8..000000000 --- a/themes/theme-react/src/views/Detail.tsx +++ /dev/null @@ -1,86 +0,0 @@ -import * as React from 'react'; -import { useEffect, useState } from 'react'; -import { Helmet } from 'react-helmet'; -import { createStyles, makeStyles, Theme } from '@material-ui/core/styles'; -import { ApiPathProps, loadApi } from '@theme-react/api'; -import { EmptyRouteMeta, Meta, RouteMeta } from '@blog/common/interfaces/routes'; -import { BreadcrumbList } from '@theme-react/components/BreadcrumbList'; -import { ArticleDetail } from '@theme-react/components/ArticleDetail'; -import { ContentItems } from '@theme-react/components/ContentItems'; -import { DRAWER_WIDTH, TYPE_JSON_LD } from '@theme-react/constants'; -import { ContentItem } from '@blog/common/interfaces/articles/content-item'; -import Container from '@material-ui/core/Container'; - -const useStyles = makeStyles((theme: Theme) => - createStyles({ - root: { - display: 'flex', - flexWrap: 'wrap', - justifyContent: 'space-around', - overflow: 'hidden', - backgroundColor: theme.palette.background.paper, - paddingTop: theme.spacing(1), - [theme.breakpoints.down('sm')]: { - padding: theme.spacing(1) - } - }, - toc: { - display: 'flex', - flexWrap: 'wrap', - justifyContent: 'flex-end', - overflow: 'hidden', - backgroundColor: theme.palette.background.paper, - [theme.breakpoints.up('md')]: { - width: DRAWER_WIDTH - } - }, - content: { - padding: 0, - margin: 0, - [theme.breakpoints.down('sm')]: { - display: 'flex', - flexWrap: 'wrap', - overflow: 'hidden' - } - } - }) -); - -export const Detail: React.FC = (props) => { - const classes = useStyles(); - const 
[routeMeta, setRouteMeta] = useState(EmptyRouteMeta); - const [metas, setMetas] = useState([]); - const [contentItems, setContentItems] = useState([]); - - const loadData = async () => { - const routeMeta = await loadApi(props.apiPath); - setRouteMeta(routeMeta); - setMetas(routeMeta.metas); - setContentItems(routeMeta.data.toc); - }; - - useEffect(() => { - loadData(); - }, [props.apiPath]); - - return ( - - - {routeMeta.title} - - {metas.map((meta, index) => ( - - ))} - -
- - -
-
- -
-
- ); -}; - -export default Detail; diff --git a/themes/theme-react/src/views/List.tsx b/themes/theme-react/src/views/List.tsx deleted file mode 100644 index 829fbba89..000000000 --- a/themes/theme-react/src/views/List.tsx +++ /dev/null @@ -1,69 +0,0 @@ -import * as React from 'react'; -import { useEffect, useState } from 'react'; -import { Helmet } from 'react-helmet'; -import { createStyles, makeStyles, Theme } from '@material-ui/core/styles'; -import { EmptyRouteMeta, Meta, RouteMeta, RoutePathPrefix } from '@blog/common/interfaces/routes'; -import { ApiPathProps, loadApi } from '@theme-react/api'; -import { ArticleCard } from '@theme-react/components/ArticleCard'; -import { ArticleContext } from '@blog/common/interfaces/articles/article-context'; -import { BreadcrumbList } from '@theme-react/components/BreadcrumbList'; -import { TYPE_JSON_LD } from '@theme-react/constants'; -import { buildURLPath } from '@blog/common/utils/path.util'; -import Container from '@material-ui/core/Container'; - -const useStyles = makeStyles((theme: Theme) => - createStyles({ - root: { - display: 'flex', - flexWrap: 'wrap', - justifyContent: 'space-around', - overflow: 'hidden', - backgroundColor: theme.palette.background.paper, - paddingTop: theme.spacing(1), - [theme.breakpoints.down('sm')]: { - padding: theme.spacing(1) - } - } - }) -); - -export const List: React.FC = (props) => { - const classes = useStyles(); - const [routeMeta, setRouteMeta] = useState(EmptyRouteMeta); - const [metas, setMetas] = useState([]); - - const [articles, setArticles] = useState[]>([]); - const [showBreadcrumbs, setShowBreadcrumbs] = useState(true); - - const loadData = async () => { - const routeMeta = await loadApi(props.apiPath); - setRouteMeta(routeMeta); - setArticles(routeMeta.data); - setMetas(routeMeta.metas); - }; - - useEffect(() => { - loadData(); - setShowBreadcrumbs(!(props.apiPath === buildURLPath(RoutePathPrefix.HOME_ALIAS))); - }, [props.apiPath]); - - return ( - - - {routeMeta.title} - 
- {metas.map((meta, index) => ( - - ))} - - - {showBreadcrumbs && } - - {articles.map((article) => ( - - ))} - - ); -}; - -export default List; diff --git a/themes/theme-react/src/views/Table.tsx b/themes/theme-react/src/views/Table.tsx deleted file mode 100644 index 2614d7990..000000000 --- a/themes/theme-react/src/views/Table.tsx +++ /dev/null @@ -1,71 +0,0 @@ -import * as React from 'react'; -import { useEffect, useState } from 'react'; -import { Helmet } from 'react-helmet'; -import { createStyles, makeStyles, Theme } from '@material-ui/core/styles'; -import { EmptyRouteMeta, Meta, RouteMeta } from '@blog/common/interfaces/routes'; -import { ApiPathProps, loadApi } from '@theme-react/api'; -import { TYPE_JSON_LD } from '@theme-react/constants'; -import { BreadcrumbList } from '@theme-react/components/BreadcrumbList'; -import { CollectionCard } from '@theme-react/components/CollectionCard'; -import Container from '@material-ui/core/Container'; - -const useStyles = makeStyles((theme: Theme) => - createStyles({ - root: { - display: 'flex', - flexWrap: 'wrap', - justifyContent: 'space-around', - overflow: 'hidden', - backgroundColor: theme.palette.background.paper, - paddingTop: theme.spacing(1), - [theme.breakpoints.down('sm')]: { - padding: theme.spacing(1) - } - }, - content: { - padding: 0, - margin: 0, - display: 'flex', - flexWrap: 'wrap', - overflow: 'hidden' - } - }) -); - -export const Table: React.FC = (props) => { - const classes = useStyles(); - const [routeMeta, setRouteMeta] = useState(EmptyRouteMeta); - const [metas, setMetas] = useState([]); - const [collections, setCollections] = useState([]); - - const loadData = async () => { - const routeMeta = await loadApi(props.apiPath); - setRouteMeta(routeMeta); - setMetas(routeMeta.metas); - setCollections(routeMeta.data); - }; - - useEffect(() => { - loadData(); - }, [props.apiPath]); - - return ( - - - {routeMeta.title} - - {metas.map((meta, index) => ( - - ))} - - - - - {collections.map((collection, 
index) => ( - - ))} - - ); -}; - -export default Table; diff --git a/themes/theme-react/webpack/webpack.dev.ts b/themes/theme-react/webpack/webpack.dev.ts deleted file mode 100644 index 7001520d6..000000000 --- a/themes/theme-react/webpack/webpack.dev.ts +++ /dev/null @@ -1,82 +0,0 @@ -import merge from 'webpack-merge'; -import HtmlWebpackPlugin from 'html-webpack-plugin'; -import FriendlyErrorsPlugin from 'friendly-errors-webpack-plugin'; -import MiniCssExtractPlugin from 'mini-css-extract-plugin'; -import { CleanWebpackPlugin } from 'clean-webpack-plugin'; -import { resolve } from './path.util'; -import { BASE_DIR, DIST_DIR, webpackBaseConfig } from './webpack.base'; - -const PROTOCOL = 'http://'; -const LOCAL_HOST = 'localhost'; -const LOCAL_PORT = 8080; -export const LOCAL_URL = `${PROTOCOL}${LOCAL_HOST}:${LOCAL_PORT}/`; - -export const webpackDevConfig = merge(webpackBaseConfig, { - mode: 'development', - output: { - path: resolve(`build`), - publicPath: LOCAL_URL, - filename: '[name].bundle.js' - }, - module: { - rules: [ - { - test: /\.less$/, - include: [BASE_DIR], - use: [ - { - loader: MiniCssExtractPlugin.loader, - options: { - hmr: true - } - }, - 'style-loader', - 'css-loader', - 'postcss-loader', - 'less-loader' - ] - }, - { - test: /\.css$/, - include: [BASE_DIR], - use: ['style-loader', 'css-loader', 'postcss-loader'] - } - ] - }, - plugins: [ - new CleanWebpackPlugin(), - new MiniCssExtractPlugin({ - filename: '[name].bundle.css', - chunkFilename: '[id].bundle.css', - ignoreOrder: false - }), - new HtmlWebpackPlugin({ - template: `${BASE_DIR}/index.html`, - favicon: `${BASE_DIR}/favicon.png` - }), - new FriendlyErrorsPlugin() - ], - devServer: { - host: LOCAL_HOST, - port: LOCAL_PORT, - historyApiFallback: true, - quiet: false, - noInfo: true, - stats: { - cached: false, - assets: false, - colors: true, - version: false, - hash: false, - children: false, - timings: true, - chunks: true, - chunkModules: false - }, - publicPath: LOCAL_URL, - 
-      contentBase: DIST_DIR,
-      hot: true
-  }
-});
-
-export default webpackDevConfig;
diff --git a/themes/theme-react/webpack/webpack.prod.ts b/themes/theme-react/webpack/webpack.prod.ts
deleted file mode 100644
index e1e411e9e..000000000
--- a/themes/theme-react/webpack/webpack.prod.ts
+++ /dev/null
@@ -1,121 +0,0 @@
-import merge from 'webpack-merge';
-
-import OptimizeCssAssetsPlugin from 'optimize-css-assets-webpack-plugin';
-import TerserJSPlugin from 'terser-webpack-plugin';
-import UglifyJsPlugin from 'uglifyjs-webpack-plugin';
-import MiniCssExtractPlugin from 'mini-css-extract-plugin';
-import HtmlWebpackPlugin from 'html-webpack-plugin';
-import FaviconsWebpackPlugin from 'favicons-webpack-plugin';
-import PreloadPlugin from 'preload-webpack-plugin';
-import RobotsTxtPlugin from 'robotstxt-webpack-plugin';
-
-import { resolve } from './path.util';
-import { BASE_DIR, BASE_TITLE, webpackBaseConfig } from './webpack.base';
-
-const THEME_DIST_DIR = resolve(`dist`);
-const NODE_MODULES = resolve(`node_modules`);
-
-export const webpackProdConfig = merge(webpackBaseConfig, {
-  mode: 'production',
-  devtool: 'source-map',
-  output: {
-    path: THEME_DIST_DIR,
-    filename: `static/js/[name].[chunkhash].js`,
-    chunkFilename: `static/js/[name].[chunkhash].js`,
-    publicPath: '/'
-  },
-  module: {
-    rules: [
-      {
-        test: /\.less$/,
-        include: [BASE_DIR, NODE_MODULES],
-        use: [MiniCssExtractPlugin.loader, 'css-loader', 'postcss-loader', 'less-loader']
-      },
-      {
-        test: /\.css$/,
-        include: [BASE_DIR, NODE_MODULES],
-        use: [MiniCssExtractPlugin.loader, 'css-loader']
-      }
-    ]
-  },
-  plugins: [
-    new TerserJSPlugin({}),
-    new MiniCssExtractPlugin({
-      filename: `static/css/[name].[chunkhash].css`,
-      chunkFilename: `static/css/[name].[chunkhash].css`
-    }),
-    new HtmlWebpackPlugin({
-      template: `${BASE_DIR}/index.html`,
-      favicon: `${BASE_DIR}/favicon.png`,
-      inject: true,
-      minify: {
-        removeComments: true,
-        collapseWhitespace: true,
-        removeAttributeQuotes: false
-      }
-    }),
-    new FaviconsWebpackPlugin({
-      prefix: `static/img`,
-      outputPath: `static/img`,
-      logo: BASE_DIR + `/favicon.png`,
-      cache: true,
-      inject: true,
-      favicons: {
-        start_url: '/',
-        appName: BASE_TITLE,
-        appShortName: BASE_TITLE,
-        appDescription: ``,
-        theme_color: `#FFFFFF`,
-        background: `#FFFFFF`,
-        icons: {
-          android: true,
-          appleIcon: true,
-          appleStartup: true,
-          firefox: true,
-          windows: true
-        }
-      }
-    }),
-    new PreloadPlugin({
-      rel: 'preload',
-      include: 'allChunks'
-    }),
-    new RobotsTxtPlugin()
-  ],
-  optimization: {
-    splitChunks: {
-      cacheGroups: {
-        vendors: {
-          name: 'vendors',
-          test: /[\\\/]node_modules[\\\/]/,
-          priority: -10,
-          chunks: 'initial'
-        },
-        common: {
-          name: 'common',
-          minChunks: 2,
-          priority: -20,
-          chunks: 'initial',
-          reuseExistingChunk: true
-        }
-      }
-    },
-    minimizer: [
-      new UglifyJsPlugin({
-        cache: false,
-        parallel: true,
-        sourceMap: true
-      }),
-      new OptimizeCssAssetsPlugin({
-        cssProcessorOptions: {
-          safe: true,
-          discardComments: {
-            removeAll: true
-          }
-        },
-        canPrint: true
-      })
    ]
-  },
-  stats: 'minimal'
-});
diff --git a/themes/theme-vue/.gitignore b/themes/theme-vue/.gitignore
deleted file mode 100644
index 1efcd23bc..000000000
--- a/themes/theme-vue/.gitignore
+++ /dev/null
@@ -1,4 +0,0 @@
-# vue-cli
-vue.config.js
-webpack.config.js
-stats.json
diff --git a/themes/theme-vue/package.json b/themes/theme-vue/package.json
deleted file mode 100644
index 42d222054..000000000
--- a/themes/theme-vue/package.json
+++ /dev/null
@@ -1,90 +0,0 @@
-{
-  "name": "@blog/theme-vue",
-  "version": "6.26.198",
-  "private": true,
-  "scripts": {
-    "clean": "rimraf dist",
-    "test": "vue-cli-service test:unit",
-    "prebuild:theme": "yarn run compile:config",
-    "build:theme": "vue-cli-service build",
-    "preserve": "yarn run compile:config",
-    "serve": "vue-cli-service serve",
-    "compile:config": "tsc vue.config.ts --resolveJsonModule",
-    "debug:webpack": "vue inspect | tee > webpack.config.js"
-  },
-  "dependencies": {
-    "@blog/common": "^6.26.198",
-    "@blog/config": "^6.26.198",
-    "@blog/router": "^6.26.198",
-    "@mdi/font": "5.9.55",
-    "axios": "0.21.1",
-    "date-fns": "2.19.0",
-    "highlight.js": "10.6.0",
-    "vue": "2.6.12",
-    "vue-class-component": "7.2.6",
-    "vue-disqus": "4.0.1",
-    "vue-material": "1.0.0-beta-11",
-    "vue-meta": "2.4.0",
-    "vue-property-decorator": "9.1.2",
-    "vue-router": "3.5.1"
-  },
-  "devDependencies": {
-    "@types/jest": "26.0.20",
-    "@types/node": "13.13.45",
-    "@vue/cli-plugin-pwa": "4.5.11",
-    "@vue/cli-plugin-typescript": "4.5.11",
-    "@vue/cli-plugin-unit-jest": "4.5.11",
-    "@vue/cli-service": "4.5.11",
-    "@vue/test-utils": "1.1.3",
-    "copy-webpack-plugin": "8.0.0",
-    "less": "4.1.1",
-    "less-loader": "7.3.0",
-    "node-sass": "5.0.0",
-    "rimraf": "3.0.2",
-    "sass-loader": "10.1.1",
-    "stats-webpack-plugin": "0.7.0",
-    "ts-jest": "26.5.3",
-    "typescript": "4.0.5",
-    "vue-template-compiler": "2.6.12"
-  },
-  "postcss": {
-    "plugins": {
-      "autoprefixer": {}
-    }
-  },
-  "browserslist": [
-    "> 1%",
-    "last 2 versions"
-  ],
-  "jest": {
-    "moduleFileExtensions": [
-      "js",
-      "jsx",
-      "json",
-      "vue",
-      "ts",
-      "tsx"
-    ],
-    "moduleNameMapper": {
-      "^@theme-vue/(.*)$": "/src/$1"
-    },
-    "transform": {
-      "^.+\\.vue$": "vue-jest",
-      "^.+\\.tsx?$": "ts-jest",
-      ".+\\.(css|styl|less|sass|scss|svg|png|jpg|ttf|woff|woff2)$": "jest-transform-stub"
-    },
-    "transformIgnorePatterns": [
-      "/node_modules/"
-    ],
-    "snapshotSerializers": [
-      "jest-serializer-vue"
-    ],
-    "testMatch": [
-      "**/tests/unit/**/*.spec.(js|jsx|ts|tsx)|**/__tests__/*.(js|jsx|ts|tsx)"
-    ],
-    "watchPlugins": [
-      "jest-watch-typeahead/filename",
-      "jest-watch-typeahead/testname"
-    ]
-  }
-}
diff --git a/themes/theme-vue/vue.config.ts b/themes/theme-vue/vue.config.ts
deleted file mode 100644
index e4b166ebf..000000000
--- a/themes/theme-vue/vue.config.ts
+++ /dev/null
@@ -1,42 +0,0 @@
-import * as path from 'path';
-
-import { loadConfig } from '@blog/config';
-
-const config = loadConfig();
-
-const BASE_DIR =
path.join(__dirname, 'src'); -const DIST_DIR = config.dirs.dest; - -module.exports = { - configureWebpack: { - resolve: { - alias: { - '@theme-vue': BASE_DIR - } - }, - stats: 'minimal' - }, - chainWebpack: (conf) => { - conf.plugin('html').tap((args) => { - args[0].minify = { - removeComments: true, - collapseWhitespace: true, - removeAttributeQuotes: false - }; - return args; - }); - }, - devServer: { - contentBase: DIST_DIR, - hot: true - }, - pwa: { - name: config.site.baseTitle, - manifestOptions: { - name: config.site.baseTitle, - short_name: config.site.baseTitle, - start_url: '/' - }, - themeColor: '#1A73E8' - } -};