
As a Researcher I want to infer the home location from a trajectory #8

Open · 2 tasks
noerw opened this issue Oct 19, 2020 · 0 comments
noerw commented Oct 19, 2020

  • specify semantics of sparse input trajectory data:

    • How do we differentiate between "time between points is time stayed at that point" and "time between points encodes missing spatial data"?
      I'm sure there's already plenty of literature & code on this topic.
      A simple (and likely efficient) approach would be to annotate the first point of each new segment / recording session in the database (see the segmentation sketch after this list). Are periods between points then considered to mean "did not move", or NA, or do we apply some uncertainty metric?

      • Of the example trajectories collected, most consist of several segments (i.e. regular segments combining into a sparse trajectory), so it would make sense to store them as MultiLineStrings.
      • For the user-generated trajectory, location updates are temporally sparse throughout, so it's hard to decide where to cut between segments in order to interpret the temporal gaps in the data.
    • ...?

  • conceptualize an inference engine architecture. ideas:

    • modular: plug in new inference matchers
    • classes of inference matchers, parametrized? for example (a sketch of such a class follows below this list):
      homeInference = new MedianLocationInference({ filters: { timeOfDay: [23, 6] } })
      workInference = new MedianLocationInference({ filters: { timeOfDay: [10, 14], dayOfWeek: [0, 5] } })
    • streaming: ideally, inferences should be computable live on the stream of incoming points, to avoid recomputing over the whole trajectory for each new point. This requires some form of serialization/persistence of the inference model state.
    • not relying on DB queries, so it can run in a web environment as well? (can we handle the performance?)
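To make the segment-annotation idea concrete, here's a minimal sketch of time-gap based segmentation (all type and function names are hypothetical, and the 5-minute threshold is an arbitrary assumption): gaps above the threshold are treated as "missing spatial data" and start a new segment, which maps directly onto a MultiLineString.

```ts
interface TrackPoint {
  lon: number;
  lat: number;
  time: Date; // timestamp of the location update
}

// Split a temporally sparse trajectory into segments: any gap longer
// than maxGapMs is interpreted as "missing spatial data" and starts a
// new segment; gaps below the threshold mean "stayed at that point".
function splitIntoSegments(
  points: TrackPoint[],
  maxGapMs = 5 * 60 * 1000,
): TrackPoint[][] {
  const segments: TrackPoint[][] = [];
  let current: TrackPoint[] = [];
  for (const p of points) {
    const prev = current[current.length - 1];
    if (prev && p.time.getTime() - prev.time.getTime() > maxGapMs) {
      segments.push(current);
      current = [];
    }
    current.push(p);
  }
  if (current.length > 0) segments.push(current);
  return segments;
}

// GeoJSON MultiLineString coordinates: one linestring per segment
const multiLineCoords = (points: TrackPoint[]): number[][][] =>
  splitIntoSegments(points).map(seg => seg.map(p => [p.lon, p.lat]));
```

If the first point of each recording session is already annotated in the database, the gap heuristic could be replaced by that flag.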
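And a rough sketch of what a parametrized, streaming-capable MedianLocationInference could look like, reusing the TrackPoint type from the sketch above. The filter semantics (hours wrap past midnight, days are 0-indexed from Sunday) and the JSON serialization are just assumptions:

```ts
interface InferenceFilters {
  timeOfDay?: [number, number]; // [startHour, endHour], may wrap past midnight
  dayOfWeek?: [number, number]; // [firstDay, lastDay], 0 = Sunday
}

class MedianLocationInference {
  // state kept as plain arrays so the model state can be serialized/persisted
  private lons: number[] = [];
  private lats: number[] = [];

  constructor(private opts: { filters: InferenceFilters }) {}

  // streaming interface: points are consumed one at a time as they arrive
  addPoint(p: TrackPoint): void {
    if (!this.matchesFilters(p)) return;
    this.lons.push(p.lon);
    this.lats.push(p.lat);
  }

  // component-wise median of all matching points seen so far
  result(): [number, number] | null {
    if (this.lons.length === 0) return null;
    return [median(this.lons), median(this.lats)];
  }

  // persist the inference model state, so the stream can be resumed later
  serialize(): string {
    return JSON.stringify({ lons: this.lons, lats: this.lats });
  }

  private matchesFilters(p: TrackPoint): boolean {
    const { timeOfDay, dayOfWeek } = this.opts.filters;
    if (timeOfDay) {
      const h = p.time.getHours();
      const [start, end] = timeOfDay;
      const inWindow =
        start <= end ? h >= start && h < end
                     : h >= start || h < end; // window wraps midnight
      if (!inWindow) return false;
    }
    if (dayOfWeek) {
      const d = p.time.getDay();
      if (d < dayOfWeek[0] || d > dayOfWeek[1]) return false;
    }
    return true;
  }
}

function median(xs: number[]): number {
  const sorted = [...xs].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2;
}

// usage, matching the example above:
const homeInference = new MedianLocationInference({ filters: { timeOfDay: [23, 6] } });
```

Keeping all matching points makes the state trivially serializable but unbounded; a true streaming median estimator could bound memory if that becomes a problem.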
noerw created this issue from a note in Prototype v0 (To do) Oct 19, 2020
noerw moved this from To do to Next in Prototype v0 Nov 11, 2020
zven moved this from Next to To do in Prototype v0 Feb 3, 2021