feat: upgrade flux to v0.188.0 (#23911)
* feat: upgrade flux to 0.171.0

Tests failing, safety commit

First step in #23815

* fix: remove "org" parameter" from writeOptSource

I attempted to implement the "orgOpt" argument in a similar fashion
to f6669f7. However, it looks like Flux doesn't accept "org" as
a parameter to "load". It responds with:

Error calling function \"load\" @113:16-113:30: error calling function \"to\" @6:19-6:47: unused arguments [org]

This brings us from 194 passing to 570 passing.
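
As a rough sketch of the resulting change (bucket names here are
hypothetical and not part of this commit), the Flux source produced by
"writeOptSource" now has to call "to" without an "org" argument:

```flux
from(bucket: "db/autogen")
    |> range(start: -1h)
    // Passing an org: argument here is what triggered "unused arguments [org]"
    |> to(bucket: "db/autogen")
```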

* fix: temporarily disable broken flux tests

These tests expect rows to be in a certain order. However,
nothing specifies the sort order. This has been fixed in a
later update to flux (see 3d6f47ded).

Temporarily disable these tests until we include a fixed
version of the flux tests.
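
Not the actual upstream change, but a sketch of the idea behind the fix:
the expected order can be made deterministic by sorting explicitly before
comparing, e.g.

```flux
got = csv.from(csv: input)
    |> testing.load()
    |> range(start: -100y)
    // Sort explicitly so the comparison does not depend on storage order
    |> sort(columns: ["_time"])
```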

* chore: add tests from a492993

This fixes "test-flux.sh" so it runs tests within the "flux/"
directory. This uncovered some other issues with the tests
located within "flux/". These also needed to be updated
to match the newer flux API.

* feat: upgrade flux to 0.172.0

This includes changes made in "cbbf4b27da". Since "test.go" in 2.x
diverged from 1.x, some modifications were required to make this
compatible.

* feat: upgrade flux to 0.173.0

* feat: upgrade flux to v0.174.0

* fix: Update the condition when resetting cursor (#23522)

Filters that contain `or` may change between cursor resets, so we must remember to update the condition in the read cursor.

```flux
|> filter(fn: (r) => ((r["_field"] == "field1" and r["_value"]==true) or (r["_field"] == "field2" and r["_value"] == false)))
```

Closes influxdata/flux#4804

* feat: upgrade flux to 0.174.1

* feat: upgrade flux to 0.175.0

* chore: remove end-to-end tests

These were removed in a492993 for 2.x. They prevented "go test ./..."
from completing. As stated in the original commit, these tests should now be
handled by the "fluxtest" harness.

* feat: upgrade flux to 0.176.0

Some tests needed to be disabled within the flux harness as a
result of enabling "Optimize Aggregate Window" in flux@05a1065f.
These tests are not present in 2.x, so I am unsure whether
the breakage is resolved in a later commit.
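
For context, "Optimize Aggregate Window" changes how queries of roughly
this shape are planned (bucket name is hypothetical):

```flux
from(bucket: "db/autogen")
    |> range(start: -1h)
    // aggregateWindow pipelines are the pattern the optimization targets
    |> aggregateWindow(every: 1m, fn: mean)
```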

* feat: upgrade flux to 0.177.0

* feat: upgrade flux to 0.178.0

* feat: upgrade flux to v0.179.0

This removes all invocations of "flux.RegisterOpSpec". According
to flux@e39096d5, "flux.RegisterOpSpec" does nothing in the
current version of flux and was removed.

* chore: update fluxtest skip list (#23633)

* chore: manually backport 785a465

This removes the reference to "flux.Spec".

* build(flux): update flux to v0.181.0 (#23682)

* build(flux): update flux to v0.184.2

* chore: skip more Flux acceptance tests

There are issues for each skip detailed in test-flux.sh.

* feat: upgrade flux to v0.185.0

This adds "FluxTesting" to the "HTTPD" configuration. This option is
hidden and disabled by default. When "FluxTesting" is set, it
enables the default testing flags for "Flux".

These flags allow the vectorized float tests, as well as tests that
require the "removeRedundantSortNodes" and "labelPolymorphism"
flags, to run. These changes are based on d8553c0.

flux@3d6f47ded is included within this version of Flux. Therefore
we can now include the "group_*" tests.
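
As a hedged illustration (not taken from this commit), the vectorized float
tests exercise "map" calls doing floating-point arithmetic, roughly of this
shape:

```flux
from(bucket: "db/autogen")
    |> range(start: -1h)
    // Simple float arithmetic inside map() is what the vectorizer targets
    |> map(fn: (r) => ({r with _value: r._value * 2.0}))
```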

* feat: upgrade flux to 0.186.0

* feat: upgrade flux to 0.187.0

* feat: upgrade flux to 0.188.0

* fix: re-run ./generate.sh with updated protoc

* fix: restrict cores to match CircleCI documentation

Co-authored-by: davidby-influx <dbyrne@influxdata.com>
Co-authored-by: Markus Westerlind <marwes91@gmail.com>
Co-authored-by: Sean Brickley <sean@wabr.io>
Co-authored-by: Jonathan A. Sternberg <jonathan@influxdata.com>
Co-authored-by: Christopher M. Wolff <chris.wolff@influxdata.com>
6 people committed Nov 15, 2022
1 parent be9a3d4 commit 5976e41
Showing 31 changed files with 727 additions and 538 deletions.
14 changes: 12 additions & 2 deletions .circleci/config.yml
@@ -244,7 +244,7 @@ jobs:
unit_test_race:
docker:
- image: quay.io/influxdb/cross-builder:<< pipeline.parameters.cross-container-tag >>
resource_class: large
resource_class: xlarge
steps:
- checkout
- restore_cache:
@@ -257,7 +257,17 @@
set -x
mkdir -p junit-race/
export GORACE="halt_on_error=1"
gotestsum --junitfile junit-race/influxdb.junit.xml -- -race ./...
# "resource_class: xlarge" creates a Docker container with eight
# virtual cpu cores. However, applications like "nproc" return
# the host machine's core count (which in this case is 36).
# When less cores are available than advertised, the
# race-tests fail.
#
# We'll manually reduce the number of available cores to what
# is specified by the CircleCI documentation:
# https://circleci.com/product/features/resource-classes/
taskset -c 0-7 \
gotestsum --junitfile junit-race/influxdb.junit.xml -- -race ./...
no_output_timeout: 1500s
- store_test_results:
path: junit-race/
2 changes: 1 addition & 1 deletion cmd/influx_tools/internal/format/binary/tools_binary.pb.go

Some generated files are not rendered by default.

2 changes: 1 addition & 1 deletion cmd/influxd/backup_util/internal/backup_util.pb.go

Some generated files are not rendered by default.

9 changes: 8 additions & 1 deletion cmd/influxd/run/server.go
@@ -15,6 +15,7 @@ import (

"github.com/influxdata/flux"
"github.com/influxdata/flux/dependencies/testing"
"github.com/influxdata/flux/execute/executetest"
"github.com/influxdata/influxdb"
"github.com/influxdata/influxdb/coordinator"
influxdb2 "github.com/influxdata/influxdb/flux/stdlib/influxdata/influxdb"
@@ -312,10 +313,16 @@ func (s *Server) appendHTTPDService(c httpd.Config) error {
if err != nil {
return err
}

deps := []flux.Dependency{storageDep, testing.FrameworkConfig{}}
if s.config.HTTPD.FluxTesting {
deps = append(deps, executetest.NewDefaultTestFlagger())
}

srv.Handler.Controller, err = control.New(
s.config.FluxController,
s.Logger.With(zap.String("service", "flux-controller")),
[]flux.Dependency{storageDep, testing.FrameworkConfig{}},
deps,
)
if err != nil {
return err
2 changes: 1 addition & 1 deletion flux/stdlib/influxdata/influxdb/buckets.go
@@ -149,7 +149,7 @@ func (rule LocalBucketsRule) Name() string {
}

func (rule LocalBucketsRule) Pattern() plan.Pattern {
return plan.Pat(influxdb.BucketsKind)
return plan.MultiSuccessor(influxdb.BucketsKind)
}

func (rule LocalBucketsRule) Rewrite(ctx context.Context, node plan.Node) (plan.Node, bool, error) {
37 changes: 36 additions & 1 deletion flux/stdlib/influxdata/influxdb/filter_test.flux
@@ -33,9 +33,44 @@ testcase filter {
,,0,2018-05-22T19:53:36Z,system,host.local,load1,1.63
")

got = testing.loadStorage(csv: input)
got = csv.from(csv: input)
|> testing.load()
|> range(start: -100y)
|> filter(fn: (r) => r._measurement == "system" and r._field == "load1")
|> drop(columns: ["_start", "_stop"])
testing.diff(want, got)
}


input_issue_4804 = "#datatype,string,long,dateTime:RFC3339,string,string,string,boolean
#group,false,false,false,true,true,true,false
#default,_result,,,,,,
,result,table,_time,_measurement,host,_field,_value
,,0,2018-05-22T19:53:26Z,system,host.local,load1,true
,,0,2018-05-22T19:53:36Z,system,host.local,load1,false
,,1,2018-05-22T19:53:26Z,system,host.local,load3,false
,,2,2018-05-22T19:53:26Z,system,host.local,load4,true
"

testcase flux_issue_4804 {
expect.planner(rules: [
"influxdata/influxdb.FromStorageRule": 1,
"PushDownRangeRule": 1,
"PushDownFilterRule": 1,
])

want = csv.from(csv: "#datatype,string,long,dateTime:RFC3339,string,string,string,boolean
#group,false,false,false,true,true,true,false
#default,_result,,,,,,
,result,table,_time,_measurement,host,_field,_value
,,0,2018-05-22T19:53:26Z,system,host.local,load1,true
,,1,2018-05-22T19:53:26Z,system,host.local,load3,false
")

got = csv.from(csv: input_issue_4804)
|> testing.load()
|> range(start: -100y)
|> filter(fn: (r) => ((r["_field"] == "load1" and r["_value"] == true) or (r["_field"] == "load3" and r["_value"] == false)))
|> drop(columns: ["_start", "_stop"])
testing.diff(want, got)
}
267 changes: 267 additions & 0 deletions flux/stdlib/influxdata/influxdb/multi_measure_test.flux
@@ -0,0 +1,267 @@
package influxdb_test

import "csv"
import "testing"

option now = () => 2030-01-01T00:00:00Z

input = "
#datatype,string,long,dateTime:RFC3339,string,string,string,double
#group,false,false,false,true,true,true,false
#default,_result,,,,,,
,result,table,_time,_measurement,host,_field,_value
,,0,2018-05-22T19:53:26Z,system,host.local,load1,1.83
,,0,2018-05-22T19:53:36Z,system,host.local,load1,1.72
,,0,2018-05-22T19:53:46Z,system,host.local,load1,1.74
,,0,2018-05-22T19:53:56Z,system,host.local,load1,1.63
,,0,2018-05-22T19:54:06Z,system,host.local,load1,1.91
,,0,2018-05-22T19:54:16Z,system,host.local,load1,1.84

,,1,2018-05-22T19:53:26Z,sys,host.local,load3,1.98
,,1,2018-05-22T19:53:36Z,sys,host.local,load3,1.97
,,1,2018-05-22T19:53:46Z,sys,host.local,load3,1.97
,,1,2018-05-22T19:53:56Z,sys,host.local,load3,1.96
,,1,2018-05-22T19:54:06Z,sys,host.local,load3,1.98
,,1,2018-05-22T19:54:16Z,sys,host.local,load3,1.97

,,2,2018-05-22T19:53:26Z,system,host.local,load5,1.95
,,2,2018-05-22T19:53:36Z,system,host.local,load5,1.92
,,2,2018-05-22T19:53:46Z,system,host.local,load5,1.92
,,2,2018-05-22T19:53:56Z,system,host.local,load5,1.89
,,2,2018-05-22T19:54:06Z,system,host.local,load5,1.94
,,2,2018-05-22T19:54:16Z,system,host.local,load5,1.93

,,3,2018-05-22T19:53:26Z,var,host.local,load3,91.98
,,3,2018-05-22T19:53:36Z,var,host.local,load3,91.97
,,3,2018-05-22T19:53:46Z,var,host.local,load3,91.97
,,3,2018-05-22T19:53:56Z,var,host.local,load3,91.96
,,3,2018-05-22T19:54:06Z,var,host.local,load3,91.98
,,3,2018-05-22T19:54:16Z,var,host.local,load3,91.97

,,4,2018-05-22T19:53:26Z,swap,host.global,used_percent,82.98
,,4,2018-05-22T19:53:36Z,swap,host.global,used_percent,82.59
,,4,2018-05-22T19:53:46Z,swap,host.global,used_percent,82.59
,,4,2018-05-22T19:53:56Z,swap,host.global,used_percent,82.59
,,4,2018-05-22T19:54:06Z,swap,host.global,used_percent,82.59
,,4,2018-05-22T19:54:16Z,swap,host.global,used_percent,82.64

#datatype,string,long,dateTime:RFC3339,string,string,string,double
#group,false,false,false,true,true,true,false
#default,_result,,,,,,
,result,table,_time,_measurement,loc,_field,_value
,,0,2018-05-22T19:53:26Z,locale,en,lat,37.09
,,0,2018-05-22T19:53:36Z,locale,en,lat,37.10
,,0,2018-05-22T19:53:46Z,locale,en,lat,37.08
"

testcase multi_measure {
got = csv.from(csv: input)
|> testing.load()
|> range(start: 2018-01-01T00:00:00Z, stop: 2019-01-01T00:00:00Z)
|> filter(fn: (r) => r["_measurement"] == "system" or r["_measurement"] == "sys")
|> filter(fn: (r) => r["_field"] == "load1" or r["_field"] == "load3")
|> drop(columns: ["_start", "_stop"])

want = csv.from(csv: "#datatype,string,long,dateTime:RFC3339,string,string,string,double
#group,false,false,false,true,true,true,false
#default,_result,,,,,,
,result,table,_time,_measurement,host,_field,_value
,,0,2018-05-22T19:53:26Z,system,host.local,load1,1.83
,,0,2018-05-22T19:53:36Z,system,host.local,load1,1.72
,,0,2018-05-22T19:53:46Z,system,host.local,load1,1.74
,,0,2018-05-22T19:53:56Z,system,host.local,load1,1.63
,,0,2018-05-22T19:54:06Z,system,host.local,load1,1.91
,,0,2018-05-22T19:54:16Z,system,host.local,load1,1.84
,,1,2018-05-22T19:53:26Z,sys,host.local,load3,1.98
,,1,2018-05-22T19:53:36Z,sys,host.local,load3,1.97
,,1,2018-05-22T19:53:46Z,sys,host.local,load3,1.97
,,1,2018-05-22T19:53:56Z,sys,host.local,load3,1.96
,,1,2018-05-22T19:54:06Z,sys,host.local,load3,1.98
,,1,2018-05-22T19:54:16Z,sys,host.local,load3,1.97
")

testing.diff(got, want)
}

testcase multi_measure_match_all {
got = csv.from(csv: input)
|> testing.load()
|> range(start: 2018-01-01T00:00:00Z, stop: 2019-01-01T00:00:00Z)
|> filter(fn: (r) => r["_measurement"] == "system" or r["_measurement"] == "sys" or r["_measurement"] == "var" or r["_measurement"] == "swap")
|> filter(fn: (r) => r["_field"] == "load1" or r["_field"] == "load3" or r["_field"] == "load5" or r["_field"] == "used_percent")
|> drop(columns: ["_start", "_stop"])

want = csv.from(csv: "#datatype,string,long,dateTime:RFC3339,string,string,string,double
#group,false,false,false,true,true,true,false
#default,_result,,,,,,
,result,table,_time,_measurement,host,_field,_value
,,0,2018-05-22T19:53:26Z,system,host.local,load1,1.83
,,0,2018-05-22T19:53:36Z,system,host.local,load1,1.72
,,0,2018-05-22T19:53:46Z,system,host.local,load1,1.74
,,0,2018-05-22T19:53:56Z,system,host.local,load1,1.63
,,0,2018-05-22T19:54:06Z,system,host.local,load1,1.91
,,0,2018-05-22T19:54:16Z,system,host.local,load1,1.84
,,1,2018-05-22T19:53:26Z,sys,host.local,load3,1.98
,,1,2018-05-22T19:53:36Z,sys,host.local,load3,1.97
,,1,2018-05-22T19:53:46Z,sys,host.local,load3,1.97
,,1,2018-05-22T19:53:56Z,sys,host.local,load3,1.96
,,1,2018-05-22T19:54:06Z,sys,host.local,load3,1.98
,,1,2018-05-22T19:54:16Z,sys,host.local,load3,1.97
,,2,2018-05-22T19:53:26Z,system,host.local,load5,1.95
,,2,2018-05-22T19:53:36Z,system,host.local,load5,1.92
,,2,2018-05-22T19:53:46Z,system,host.local,load5,1.92
,,2,2018-05-22T19:53:56Z,system,host.local,load5,1.89
,,2,2018-05-22T19:54:06Z,system,host.local,load5,1.94
,,2,2018-05-22T19:54:16Z,system,host.local,load5,1.93
,,3,2018-05-22T19:53:26Z,var,host.local,load3,91.98
,,3,2018-05-22T19:53:36Z,var,host.local,load3,91.97
,,3,2018-05-22T19:53:46Z,var,host.local,load3,91.97
,,3,2018-05-22T19:53:56Z,var,host.local,load3,91.96
,,3,2018-05-22T19:54:06Z,var,host.local,load3,91.98
,,3,2018-05-22T19:54:16Z,var,host.local,load3,91.97
,,4,2018-05-22T19:53:26Z,swap,host.global,used_percent,82.98
,,4,2018-05-22T19:53:36Z,swap,host.global,used_percent,82.59
,,4,2018-05-22T19:53:46Z,swap,host.global,used_percent,82.59
,,4,2018-05-22T19:53:56Z,swap,host.global,used_percent,82.59
,,4,2018-05-22T19:54:06Z,swap,host.global,used_percent,82.59
,,4,2018-05-22T19:54:16Z,swap,host.global,used_percent,82.64
")

testing.diff(got, want)
}

testcase multi_measure_tag_filter {
got = csv.from(csv: input)
|> testing.load()
|> range(start: 2018-01-01T00:00:00Z, stop: 2019-01-01T00:00:00Z)
|> filter(fn: (r) => r["_measurement"] == "system" or r["_measurement"] == "swap")
|> filter(fn: (r) => r["_field"] == "load1" or r["_field"] == "load3" or r["_field"] == "used_percent")
|> filter(fn: (r) => r["host"] == "host.local" or r["host"] == "host.global")
|> drop(columns: ["_start", "_stop"])

want = csv.from(csv: "#datatype,string,long,dateTime:RFC3339,string,string,string,double
#group,false,false,false,true,true,true,false
#default,_result,,,,,,
,result,table,_time,_measurement,host,_field,_value
,,0,2018-05-22T19:53:26Z,system,host.local,load1,1.83
,,0,2018-05-22T19:53:36Z,system,host.local,load1,1.72
,,0,2018-05-22T19:53:46Z,system,host.local,load1,1.74
,,0,2018-05-22T19:53:56Z,system,host.local,load1,1.63
,,0,2018-05-22T19:54:06Z,system,host.local,load1,1.91
,,0,2018-05-22T19:54:16Z,system,host.local,load1,1.84
,,4,2018-05-22T19:53:26Z,swap,host.global,used_percent,82.98
,,4,2018-05-22T19:53:36Z,swap,host.global,used_percent,82.59
,,4,2018-05-22T19:53:46Z,swap,host.global,used_percent,82.59
,,4,2018-05-22T19:53:56Z,swap,host.global,used_percent,82.59
,,4,2018-05-22T19:54:06Z,swap,host.global,used_percent,82.59
,,4,2018-05-22T19:54:16Z,swap,host.global,used_percent,82.64
")

testing.diff(got, want)
}

testcase multi_measure_complex_or {
got = csv.from(csv: input)
|> testing.load()
|> range(start: 2018-01-01T00:00:00Z, stop: 2019-01-01T00:00:00Z)
|> filter(fn: (r) => (r["_measurement"] == "system" or r["_measurement"] == "swap") or (r["_measurement"] != "var" and r["host"] == "host.local"))
|> drop(columns: ["_start", "_stop"])

want = csv.from(csv: "#datatype,string,long,dateTime:RFC3339,string,string,string,double
#group,false,false,false,true,true,true,false
#default,_result,,,,,,
,result,table,_time,_measurement,host,_field,_value
,,0,2018-05-22T19:53:26Z,system,host.local,load1,1.83
,,0,2018-05-22T19:53:36Z,system,host.local,load1,1.72
,,0,2018-05-22T19:53:46Z,system,host.local,load1,1.74
,,0,2018-05-22T19:53:56Z,system,host.local,load1,1.63
,,0,2018-05-22T19:54:06Z,system,host.local,load1,1.91
,,0,2018-05-22T19:54:16Z,system,host.local,load1,1.84
,,2,2018-05-22T19:53:26Z,system,host.local,load5,1.95
,,2,2018-05-22T19:53:36Z,system,host.local,load5,1.92
,,2,2018-05-22T19:53:46Z,system,host.local,load5,1.92
,,2,2018-05-22T19:53:56Z,system,host.local,load5,1.89
,,2,2018-05-22T19:54:06Z,system,host.local,load5,1.94
,,2,2018-05-22T19:54:16Z,system,host.local,load5,1.93
,,4,2018-05-22T19:53:26Z,swap,host.global,used_percent,82.98
,,4,2018-05-22T19:53:36Z,swap,host.global,used_percent,82.59
,,4,2018-05-22T19:53:46Z,swap,host.global,used_percent,82.59
,,4,2018-05-22T19:53:56Z,swap,host.global,used_percent,82.59
,,4,2018-05-22T19:54:06Z,swap,host.global,used_percent,82.59
,,4,2018-05-22T19:54:16Z,swap,host.global,used_percent,82.64
,,1,2018-05-22T19:53:26Z,sys,host.local,load3,1.98
,,1,2018-05-22T19:53:36Z,sys,host.local,load3,1.97
,,1,2018-05-22T19:53:46Z,sys,host.local,load3,1.97
,,1,2018-05-22T19:53:56Z,sys,host.local,load3,1.96
,,1,2018-05-22T19:54:06Z,sys,host.local,load3,1.98
,,1,2018-05-22T19:54:16Z,sys,host.local,load3,1.97
")

testing.diff(got, want)
}

testcase multi_measure_complex_and {
got = csv.from(csv: input)
|> testing.load()
|> range(start: 2018-01-01T00:00:00Z, stop: 2019-01-01T00:00:00Z)
|> filter(fn: (r) => r["_measurement"] != "system" or r["_measurement"] == "swap")
|> filter(fn: (r) => r["_measurement"] == "swap" or r["_measurement"] == "var")
|> drop(columns: ["_start", "_stop"])

want = csv.from(csv: "#datatype,string,long,dateTime:RFC3339,string,string,string,double
#group,false,false,false,true,true,true,false
#default,_result,,,,,,
,result,table,_time,_measurement,host,_field,_value
,,4,2018-05-22T19:53:26Z,swap,host.global,used_percent,82.98
,,4,2018-05-22T19:53:36Z,swap,host.global,used_percent,82.59
,,4,2018-05-22T19:53:46Z,swap,host.global,used_percent,82.59
,,4,2018-05-22T19:53:56Z,swap,host.global,used_percent,82.59
,,4,2018-05-22T19:54:06Z,swap,host.global,used_percent,82.59
,,4,2018-05-22T19:54:16Z,swap,host.global,used_percent,82.64
,,3,2018-05-22T19:53:26Z,var,host.local,load3,91.98
,,3,2018-05-22T19:53:36Z,var,host.local,load3,91.97
,,3,2018-05-22T19:53:46Z,var,host.local,load3,91.97
,,3,2018-05-22T19:53:56Z,var,host.local,load3,91.96
,,3,2018-05-22T19:54:06Z,var,host.local,load3,91.98
,,3,2018-05-22T19:54:16Z,var,host.local,load3,91.97
")

testing.diff(got, want)
}

testcase multi_measure_negation {
got = csv.from(csv: input)
|> testing.load()
|> range(start: 2018-01-01T00:00:00Z, stop: 2019-01-01T00:00:00Z)
|> filter(fn: (r) => r["_measurement"] != "system")
|> filter(fn: (r) => r["host"] == "host.local" or not exists r["host"])
|> drop(columns: ["_start", "_stop"])

want = csv.from(csv: "#datatype,string,long,dateTime:RFC3339,string,string,string,double
#group,false,false,false,true,true,true,false
#default,_result,,,,,,
,result,table,_time,_measurement,host,_field,_value
,,1,2018-05-22T19:53:26Z,sys,host.local,load3,1.98
,,1,2018-05-22T19:53:36Z,sys,host.local,load3,1.97
,,1,2018-05-22T19:53:46Z,sys,host.local,load3,1.97
,,1,2018-05-22T19:53:56Z,sys,host.local,load3,1.96
,,1,2018-05-22T19:54:06Z,sys,host.local,load3,1.98
,,1,2018-05-22T19:54:16Z,sys,host.local,load3,1.97
,,3,2018-05-22T19:53:26Z,var,host.local,load3,91.98
,,3,2018-05-22T19:53:36Z,var,host.local,load3,91.97
,,3,2018-05-22T19:53:46Z,var,host.local,load3,91.97
,,3,2018-05-22T19:53:56Z,var,host.local,load3,91.96
,,3,2018-05-22T19:54:06Z,var,host.local,load3,91.98
,,3,2018-05-22T19:54:16Z,var,host.local,load3,91.97

#datatype,string,long,dateTime:RFC3339,string,string,string,double
#group,false,false,false,true,true,true,false
#default,_result,,,,,,
,result,table,_time,_measurement,loc,_field,_value
,,0,2018-05-22T19:53:26Z,locale,en,lat,37.09
,,0,2018-05-22T19:53:36Z,locale,en,lat,37.10
,,0,2018-05-22T19:53:46Z,locale,en,lat,37.08
")

testing.diff(got, want)
}
