
Distance prediction API

Introduction

The API was built by combining software engineering, data science, and deep learning techniques into a small but concise piece of software. The problem it addresses is travel pricing: given a trip made by a car or a person, calculate the price of the distance covered.

This work targets a small problem definition: given a distance and a weight, the price is calculated by multiplying the two field values and dividing the result by one hundred.
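As a sanity check, that pricing rule fits in a one-line function (a minimal sketch; the function name is ours, not from the original code):

```python
def calculate_price(weight, distance):
    """Pricing rule from the article: multiply the two fields, divide by 100."""
    return weight * distance / 100


print(calculate_price(50, 200))  # → 100.0
```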

This small expression then needs to be confirmed at larger scale by learning it from data, using Keras, the high-level deep-learning API that runs on top of TensorFlow or Theano. Such a framework can reproduce the relationship many times over, scaling not only with the growth of the content but with the amount of data and the definition of the problem set.

With that in mind, we can walk through everything used to build the model and the API.

Sample input generation

The initial dataset consists of 10,000 randomly generated samples pairing the (weight × distance) feature with the calculated price, obtained by dividing that product by 100. This reproduces a simple pricing case for a highly stable currency, covering both large and small distances based on profitable patterns.
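A minimal sketch of that generation step with NumPy (the value ranges, seed, and array names are assumptions, not taken from the original code):

```python
import numpy as np

rng = np.random.default_rng(42)
n_samples = 10_000

# Random weight and distance values; the upper bounds are illustrative.
weight = rng.uniform(1, 100, size=n_samples)
distance = rng.uniform(1, 1_000, size=n_samples)

# Single input feature (weight * distance) and its target price.
x = (weight * distance).reshape(-1, 1)
y = x / 100

print(x.shape, y.shape)  # → (10000, 1) (10000, 1)
```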

Definition

The model is a basic linear-regression case: a one-dimensional sequential model with the rmsprop optimizer and a mean-squared-error loss, fitted against a validation set for one hundred epochs, without a batch size defined explicitly.
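In Keras terms, the definition above corresponds roughly to the following sketch. A single Dense unit is the one-dimensional linear model; the sample count here is smaller than the real 10,000 just to keep the example quick, and the data generation is the same assumed version as above:

```python
import numpy as np
from tensorflow import keras

rng = np.random.default_rng(0)
x = rng.uniform(1, 10_000, size=(1_000, 1))  # (weight * distance) feature
y = x / 100                                  # calculated price

# One-dimensional sequential model: one Dense unit is a linear regression.
model = keras.Sequential([keras.Input(shape=(1,)), keras.layers.Dense(1)])
model.compile(optimizer="rmsprop", loss="mse")

# 100 epochs, default batch size (none set explicitly), with a validation split.
history = model.fit(x, y, epochs=100, validation_split=0.2, verbose=0)
```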

Compilation

rmsprop is used because the problem only requires a basic understanding of the data. To be more precise we would need real data extracted from some source, free of hypotheses; here the answer is instead obtained from a small piece of code.

Accuracy & Prediction

This part matters especially on old hardware: the model should be trained at a convenient moment and then reused through the saved model, rather than retrained for every prediction.
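A sketch of that save-then-reuse cycle (the file name is ours; the model is left untrained here because an untrained model is enough to demonstrate the round trip):

```python
import numpy as np
from tensorflow import keras

# Build and compile the same one-unit linear model.
model = keras.Sequential([keras.Input(shape=(1,)), keras.layers.Dense(1)])
model.compile(optimizer="rmsprop", loss="mse")

x = np.array([[100.0], [2_000.0]])

# Save once, then predict from the reloaded copy, without the original code.
model.save("price_model.keras")
reloaded = keras.models.load_model("price_model.keras")

original = model.predict(x, verbose=0)
restored = reloaded.predict(x, verbose=0)
```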

Analysis of the problem

This model may need to be used from another kind of technology, for portability and for further development. For example, it can run inside a distributed architecture, activated by a small program, a socket API, or a REST API. That saved model can be used without the original source code, in a large-scale scenario, with or without a defined lifetime.

Each part can then be built in a separate pipeline; the core model code stays smaller than the rest of the source code base and iterates on its own.

Inherit the work

The model interface needs to stay the same, and this is possible without any kind of transpilation thanks to the matching API definitions of Keras and the Keras.js implementation. Keep one thing in mind: in another project the model becomes a black box, which is convenient but hard to maintain if the weights buffer is lost.

Source-code versioning with git-lfs can deal with that nicely, or a static storage system can be used instead; it just depends on how you combine them.
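For example, tracking saved model artifacts with git-lfs comes down to one `.gitattributes` entry (the `*.keras` pattern is an assumption about the artifact's extension):

```
# .gitattributes entry produced by `git lfs track "*.keras"`
*.keras filter=lfs diff=lfs merge=lfs -text
```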

Scaling the solution

For good usage of the complete solution, separate it into different projects: the model (modularized is better), the module implementing the model, the app calling the module, and the service serving the API. Remember that using two different languages can be confusing to manage without a good reason and clearly described goals, whether for a large proof of concept or to become a fact.
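One way to picture that separation (project names are illustrative only):

```
price-model/          # training code and the saved model artifact
price-model-module/   # thin module wrapping model load + predict
price-app/            # application calling the module
price-api/            # service exposing the module over REST
```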

Deploying a solution

Once everything is done, you can deliver the API as stable without bumping the version for every part. This greatly reduces the complexity of delivering this kind of project and brings gains quickly on the way to making it a native solution.

Point of fact (real-world use case)

Large systems and distributed parallel computing, without having to change the market all the way down.