
Remove erroneous random forest application #726

Merged


@zachgk zachgk commented Mar 5, 2021

In the only place it was used, the application was changed to the more accurate softmax_regression (matching the terminology from the D2L book).
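For context, softmax_regression (the D2L term adopted here) refers to a classifier that maps raw scores (logits) to class probabilities. A minimal, DJL-independent sketch of the softmax function at the heart of it:

```java
// Plain-Java sketch of softmax: turns logits into probabilities that sum to 1.
public class SoftmaxExample {
    static double[] softmax(double[] logits) {
        // Subtract the max logit for numerical stability.
        double max = Double.NEGATIVE_INFINITY;
        for (double x : logits) {
            max = Math.max(max, x);
        }
        double[] out = new double[logits.length];
        double sum = 0;
        for (int i = 0; i < logits.length; i++) {
            out[i] = Math.exp(logits[i] - max);
            sum += out[i];
        }
        for (int i = 0; i < out.length; i++) {
            out[i] /= sum;
        }
        return out;
    }

    public static void main(String[] args) {
        double[] probs = softmax(new double[] {2.0, 1.0, 0.1});
        System.out.println(java.util.Arrays.toString(probs));
    }
}
```

A model like the iris classifier in this PR is a single linear layer followed by this function, which is why "random forest" was the wrong label.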

@zachgk zachgk requested a review from a team March 5, 2021 01:42
codecov-io commented Mar 5, 2021

Codecov Report

Merging #726 (8ef5b64) into master (48cf663) will increase coverage by 0.05%.
The diff coverage is 100.00%.


@@             Coverage Diff              @@
##             master     #726      +/-   ##
============================================
+ Coverage     68.95%   69.01%   +0.05%     
- Complexity     3966     3974       +8     
============================================
  Files           462      462              
  Lines         18634    18633       -1     
  Branches       1998     1998              
============================================
+ Hits          12850    12859       +9     
+ Misses         4753     4751       -2     
+ Partials       1031     1023       -8     
Impacted Files Coverage Δ Complexity Δ
.../main/java/ai/djl/onnxruntime/zoo/OrtModelZoo.java 100.00% <ø> (ø) 4.00 <0.00> (ø)
...ime/zoo/tabular/softmax_regression/IrisFlower.java 100.00% <ø> (ø) 5.00 <0.00> (?)
api/src/main/java/ai/djl/Application.java 56.86% <100.00%> (-0.83%) 8.00 <0.00> (ø)
...tmax_regression/IrisClassificationModelLoader.java 90.00% <100.00%> (ø) 2.00 <1.00> (?)
api/src/main/java/ai/djl/inference/Predictor.java 87.50% <0.00%> (-2.89%) 23.00% <0.00%> (-1.00%)
api/src/main/java/ai/djl/training/Trainer.java 95.83% <0.00%> (+3.12%) 34.00% <0.00%> (+1.00%)
api/src/main/java/ai/djl/repository/Artifact.java 93.10% <0.00%> (+3.44%) 38.00% <0.00%> (+2.00%)
...djl/training/listener/LoggingTrainingListener.java 88.49% <0.00%> (+5.30%) 28.00% <0.00%> (+6.00%)


Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

@zachgk zachgk force-pushed the noRandomForestApplication branch from 3156ce6 to 8110169 Compare March 9, 2021 00:12
The application was changed to the more accurate softmax_regression (matching
the terminology from the D2L book).

Change-Id: I1f69f005bbe38b125f2709c2988d06c14eebb765
@zachgk zachgk force-pushed the noRandomForestApplication branch from 8110169 to f8d3f2a Compare March 9, 2021 01:25
@lanking520 lanking520 merged commit 43e5891 into deepjavalibrary:master Mar 9, 2021
@zachgk zachgk deleted the noRandomForestApplication branch March 9, 2021 19:41
zachgk added a commit that referenced this pull request Mar 11, 2021
* This creates the component which will populate the Download Tab with Download Buttons.

* Making a place for the download buttons.

* Adding the Model Download Handler, allowing the backend to feed the links into the Model View, and making slight changes for readability.

* Getting rid of some of the test code.

* Improve Block usability (#712)

* Use builder pattern for Parameter (#661)

* Make XavierInitializer default value & Improve setInitializer (#664)

* Refactor initialize (#675)

* Remove NDManager on getOutputShapes (#710)

* Removing unnecessary logging messages.

* block factory init commit (#697)

* [DOCS] Fixing TrainingListener documentation (#718)

* Fixing TrainingListener documentation

* Fixing PR reviews

* Fix DJL serving flaky test for mac (#721)

Change-Id: I9eccc84b0c34652e50c5fe5a4fe42f2b82d65a3d

* Fixing all of the nits.

* Getting rid of unnecessary methods.

* update onnxruntime along with String tensor (#724)

* Add profiler doc (#722)

* Resolving some comments.

* Using a better criterion in case multiple models have the same name.

* Fixing the java doc.

* Configure verbose of mxnet extra libraries (#728)

Change-Id: I66d54aa496cccbb9e8c0a89eeaa458605958d9c6

* Added a TODO for using the artifact repo to get the base uri.

* paddlepaddle CN notebook (#730)

* paddlepaddle CN notebook

* install font

Change-Id: I2d749e617b0bf78ecbcd168b82c53a1fab49a2c0

* refactor on name

Change-Id: I9e379eee51ceae16391850b3ba9782acb04c4021

* Refine the text

Co-authored-by: gstu1130 <gstu1130@gmail.com>

* add EI documentation (#733)

* add EI documentation

* fix pmd rules

Change-Id: Ieee5577c26f6df2843781f8f9180de35069a5de3

* allow pytorch stream model loading (#729)

* allow pytorch stream model loading

* updates

Change-Id: Ibc26261b90de673712e90de0d640a8f32f23763e

* add NDList decode from inputStream (#734)

Change-Id: I6a31d8b0b955f2dbb762220b101e3928a34699c1

* Remove memory scope and improve memory management (#695)

The MemoryScope revealed a number of shortcomings within DJL's memory
management. Along with deleting the MemoryScope, many of them are fixed as part
of this PR.

First, the NDManager.{attach, detach} were renamed to xxxInternal. This is to
differentiate them from the attach and detach methods that are intended to be used.

There are two new concepts in memory management. An NDResource interface was
created to unify the concept of managed memory that was shared by NDArray and
NDList, and it could be applied to more classes in the future. It includes
getManager, attach, and detach.

Within the NDManager, there is now a second "management convention". Under the
first convention, normal resources are added to the manager and then closed when
the manager closes. This works for small numbers of NDArrays, but not when
operations transitively create resources. So, the second convention is the
tempResource. Instead of being freed when the manager is closed, tempResources
are returned to their original manager. This is used to create a temporary
scope, do operations within it, and then return the inputs and return value to
the parent while the intermediate work is cleaned up. This also matches the
concepts of ownership and borrowing.

Using these, a few additional helper methods were created. There is
`NDManager.from(resource)` to ease creation of managers based on a resource.
There is also `scopeManager.ret(returnValue)` to help with returning values
outside of the scopeManager. Lastly, there is a `scopeManager.{temp,}AttachAll`
to attach a number of resources to a manager within a single call.

Using these improvements, the new methods were applied to the old locations
where MemoryScope was used, as well as an additional case in NDManagerEx.

Also, the old attach methods were altered to return `void`. Because the return
values are no longer used anywhere and are not necessary in the current scheme,
this simplifies things. It also helps for methods like `NDList.attach`, which
does not have a single original NDManager when attaching.

Change-Id: I91d109cd14d70fa64fd8fffa0b50d88ab053013e
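The two conventions above can be illustrated with a self-contained sketch. The `Manager` and `Resource` classes here are hypothetical stand-ins, not DJL's actual NDManager/NDResource API: closing a scope frees what the scope owns and returns borrowed (temp-attached) resources to their original manager.

```java
import java.util.ArrayList;
import java.util.IdentityHashMap;
import java.util.List;
import java.util.Map;

// Hypothetical stand-in for a managed resource (e.g. an NDArray).
class Resource {
    Manager manager;
    boolean closed;
}

// Hypothetical stand-in for a resource manager with the two conventions.
class Manager implements AutoCloseable {
    private final List<Resource> owned = new ArrayList<>();
    private final Map<Resource, Manager> borrowed = new IdentityHashMap<>();

    // Convention 1: the manager owns the resource and frees it on close.
    void attach(Resource r) {
        if (r.manager != null) {
            r.manager.owned.remove(r);
        }
        r.manager = this;
        owned.add(r);
    }

    // Convention 2: borrow the resource; on close it is returned to its
    // original manager instead of being freed.
    void tempAttach(Resource r) {
        Manager original = r.manager;
        original.owned.remove(r);
        borrowed.put(r, original);
        r.manager = this;
    }

    @Override
    public void close() {
        for (Resource r : owned) {
            r.closed = true; // owned resources die with the scope
        }
        owned.clear();
        for (Map.Entry<Resource, Manager> e : borrowed.entrySet()) {
            e.getValue().attach(e.getKey()); // borrowed resources go home
        }
        borrowed.clear();
    }
}
```

With this, a temporary scope can temp-attach its inputs, do intermediate work, and on close the intermediates are freed while the inputs survive in the parent.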

* Remove erroneous random forest application (#726)

The application was changed to the more accurate softmax_regression (matching
the terminology from the D2L book).

Change-Id: I1f69f005bbe38b125f2709c2988d06c14eebb765

* Minor fixes on duplicated code (#736)

* remove methods that already defined in the NDArrayAdapter

Change-Id: I01cc03a7f5b427bf31c6b3fd8d2136f2a27fe93b

* refactor toString

Change-Id: Iea22b16e1daa9f759b55c1a8b8b85536482e551a

* remove sparse NDArray

Change-Id: Icb44096519775f54cb32cc768c14f49e33dc7ea5

* fix test

Change-Id: Icef580ed77e7bba22864ce44577de3cba51e3e41

Co-authored-by: Jake Lee <gstu1130@gmail.com>
Co-authored-by: Lanking <lanking520@live.com>
Co-authored-by: aksrajvanshi <aksrajvanshi@gmail.com>
Co-authored-by: Frank Liu <frankfliu2000@gmail.com>
Co-authored-by: Zach Kimberg <kimbergz@amazon.com>