
"fix unsigned compare problem" #5359

Merged 4 commits on Nov 6, 2017
13 changes: 3 additions & 10 deletions paddle/optimizer/CMakeLists.txt
@@ -1,5 +1,3 @@
include_directories(${CMAKE_CURRENT_BINARY_DIR})

set(OPITMIZER_SRCS
adadelta_optimizer.cc
adagrad_optimizer.cc
@@ -9,11 +7,6 @@ set(OPITMIZER_SRCS
sgd_optimizer.cc
)

add_library(paddle_optimizer STATIC ${OPITMIZER_SRCS})
add_dependencies(paddle_optimizer paddle_proto ${external_project_dependencies})
Collaborator:

TeamCity fails at:

[15:45:21]	# github.com/PaddlePaddle/Paddle/go/pserver
[15:45:21]	/tmp/go-build227723758/github.com/PaddlePaddle/Paddle/go/pserver/_obj/optimizer.cgo2.o: In function `_cgo_eff63c273c7c_Cfunc_paddle_create_optimizer':
[15:45:21]	pserver/cgo-gcc-prolog:79: undefined reference to `paddle_create_optimizer'
[15:45:21]	/tmp/go-build227723758/github.com/PaddlePaddle/Paddle/go/pserver/_obj/optimizer.cgo2.o: In function `_cgo_eff63c273c7c_Cfunc_paddle_optimizer_get_state':
[15:45:21]	pserver/cgo-gcc-prolog:98: undefined reference to `paddle_optimizer_get_state'
[15:45:21]	/tmp/go-build227723758/github.com/PaddlePaddle/Paddle/go/pserver/_obj/optimizer.cgo2.o: In function `_cgo_eff63c273c7c_Cfunc_paddle_optimizer_get_weights':
[15:45:21]	pserver/cgo-gcc-prolog:117: undefined reference to `paddle_optimizer_get_weights'
[15:45:21]	/tmp/go-build227723758/github.com/PaddlePaddle/Paddle/go/pserver/_obj/optimizer.cgo2.o: In function `_cgo_eff63c273c7c_Cfunc_paddle_release_optimizer':
[15:45:21]	pserver/cgo-gcc-prolog:135: undefined reference to `paddle_release_optimizer'
[15:45:21]	/tmp/go-build227723758/github.com/PaddlePaddle/Paddle/go/pserver/_obj/optimizer.cgo2.o: In function `_cgo_eff63c273c7c_Cfunc_paddle_update_parameter':
[15:45:21]	pserver/cgo-gcc-prolog:158: undefined reference to `paddle_update_parameter'
[15:45:21]	collect2: error: ld returned 1 exit status
[15:45:21]	make[2]: *** [go/cmd/pserver/pserver_timestamp] Error 2
[15:45:21]	go/cmd/pserver/CMakeFiles/pserver.dir/build.make:61: recipe for target 'go/cmd/pserver/pserver_timestamp' failed
[15:45:21]	make[1]: *** [go/cmd/pserver/CMakeFiles/pserver.dir/all] Error 2
[15:45:21]	make[1]: *** Waiting for unfinished jobs....

It could be that we cannot delete this line:

add_dependencies(paddle_optimizer paddle_proto ${external_project_dependencies})

Collaborator:
I confirmed that the above is not the reason: even if I add that line back, the build still fails.

I noticed that CMake no longer thinks the target paddle_go_optimizer depends on paddle_optimizer. The GraphViz file generated by CMake includes the following lines that refer to paddle_go_optimizer:

    "node62" [ label="paddle_go_optimizer" shape="diamond"];
    "node62" -> "node468" // paddle_go_optimizer -> glog
    "node62" -> "node469" // paddle_go_optimizer -> gflags
    "node62" -> "node471" // paddle_go_optimizer -> -lpthread
    "node62" -> "node31" // paddle_go_optimizer -> paddle_proto
    "node62" -> "node473" // paddle_go_optimizer -> mklml
    "node62" -> "node474" // paddle_go_optimizer -> zlib
    "node62" -> "node472" // paddle_go_optimizer -> protobuf
    "node62" -> "node475" // paddle_go_optimizer -> mkldnn
    "node62" -> "node476" // paddle_go_optimizer -> warpctc
    "node477" [ label="stdc++" shape="ellipse"];
    "node62" -> "node477" // paddle_go_optimizer -> stdc++
    "node478" [ label="m" shape="ellipse"];
    "node62" -> "node478" // paddle_go_optimizer -> m

where paddle_optimizer doesn't appear.

Contributor (author):

Because I added a namespace around these C interfaces, paddle_optimizer_get_weights became paddle::optimizer::paddle_optimizer_get_weights, so the unmangled C symbol that cgo links against no longer exists, and linking fails.

The dependency graph is quite strange, though: I tried the develop branch, and it also lacks the link between paddle_go_optimizer and paddle_optimizer.



if(WITH_TESTING)
add_simple_unittest(serialization_test)
add_simple_unittest(parameter_optimizer_test)
endif()
cc_library(paddle_optimizer STATIC SRCS ${OPITMIZER_SRCS} DEPS paddle_proto glog)
cc_test(serialization_test SRCS serialization_test.cc DEPS paddle_proto)
cc_test(parameter_optimizer_test SRCS parameter_optimizer_test.cc DEPS paddle_optimizer)
14 changes: 14 additions & 0 deletions paddle/optimizer/adadelta_optimizer.cc
@@ -1,3 +1,17 @@
/* Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserve.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. */

#include "adadelta_optimizer.h"
#include <algorithm>
#include <cmath>
14 changes: 14 additions & 0 deletions paddle/optimizer/adadelta_optimizer.h
@@ -1,3 +1,17 @@
/* Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserve.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. */

#pragma once

#include "parameter_optimizer.h"
14 changes: 14 additions & 0 deletions paddle/optimizer/adagrad_optimizer.cc
@@ -1,3 +1,17 @@
/* Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserve.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. */

#include <cmath>

#include "adagrad_optimizer.h"
14 changes: 14 additions & 0 deletions paddle/optimizer/adagrad_optimizer.h
@@ -1,3 +1,17 @@
/* Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserve.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. */

#pragma once

#include "parameter_optimizer.h"
14 changes: 14 additions & 0 deletions paddle/optimizer/adam_optimizer.cc
@@ -1,3 +1,17 @@
/* Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserve.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. */

#include "adam_optimizer.h"
#include <cmath>

14 changes: 14 additions & 0 deletions paddle/optimizer/adam_optimizer.h
@@ -1,3 +1,17 @@
/* Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserve.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. */

#pragma once

#include "parameter_optimizer.h"
37 changes: 25 additions & 12 deletions paddle/optimizer/optimizer.cc
@@ -1,3 +1,17 @@
/* Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserve.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. */

#include "optimizer.h"
#include <glog/logging.h>
#include <cstdlib>
@@ -6,31 +20,30 @@

#include "parameter_optimizer.h"

using namespace paddle;
using namespace paddle::optimizer;
using paddle::optimizer::ParameterOptimizer;
using paddle::optimizer::Tensor;

template <paddle_element_type VALUE>
struct EnumToType {};

template <class T>
struct TypeToEnum {};

#define MATCH_ENUM_TYPE(TYPE, ENUM) \
template <> \
struct TypeToEnum<TYPE> { \
static paddle_element_type v() { return ENUM; }; \
static constexpr TYPE value = ENUM; \
}; \
template <> \
struct EnumToType<ENUM> { \
typedef TYPE Type; \
#define MATCH_ENUM_TYPE(TYPE, ENUM) \
template <> \
struct TypeToEnum<TYPE> { \
static paddle_element_type v() { return ENUM; } \
static constexpr TYPE value = ENUM; \
}; \
template <> \
struct EnumToType<ENUM> { \
typedef TYPE Type; \
}

MATCH_ENUM_TYPE(int32_t, PADDLE_ELEMENT_TYPE_INT32);
MATCH_ENUM_TYPE(uint32_t, PADDLE_ELEMENT_TYPE_UINT32);
MATCH_ENUM_TYPE(int64_t, PADDLE_ELEMENT_TYPE_INT64);
MATCH_ENUM_TYPE(uint64_t, PADDLE_ELEMENT_TYPE_UINT64);
// TODO(zhihong): only implement below type, need to fix
MATCH_ENUM_TYPE(float, PADDLE_ELEMENT_TYPE_FLOAT32);
MATCH_ENUM_TYPE(double, PADDLE_ELEMENT_TYPE_FLOAT64);

14 changes: 14 additions & 0 deletions paddle/optimizer/optimizer.h
@@ -1,3 +1,17 @@
/* Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserve.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. */

#pragma once

#include <stdbool.h>
14 changes: 14 additions & 0 deletions paddle/optimizer/parameter_optimizer.cc
@@ -1,3 +1,17 @@
/* Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserve.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. */

#include <glog/logging.h>
#include "adadelta_optimizer.h"
#include "adagrad_optimizer.h"
14 changes: 14 additions & 0 deletions paddle/optimizer/parameter_optimizer.h
@@ -1,3 +1,17 @@
/* Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserve.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. */

#pragma once

#include <glog/logging.h>
@@ -110,7 +110,7 @@ class OptimizerTest : public testing::Test {

int s = 0;
float* newp = (float*)opts_[i]->get_weight(&s);
EXPECT_EQ(s, kSize);
EXPECT_EQ(static_cast<size_t>(s), kSize);
for (size_t j = 0; j < kSize; ++j) {
EXPECT_EQ(newp[j], (*p)[j]);
}
14 changes: 14 additions & 0 deletions paddle/optimizer/sgd_optimizer.cc
@@ -1,3 +1,17 @@
/* Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserve.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. */

#include "sgd_optimizer.h"
#include "serialization.h"

15 changes: 14 additions & 1 deletion paddle/optimizer/sgd_optimizer.h
@@ -1,3 +1,17 @@
/* Copyright (c) 2016 PaddlePaddle Authors. All Rights Reserve.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. */

#pragma once

#include "parameter_optimizer.h"
@@ -15,7 +29,6 @@ class SGDOptimizer : public ParameterOptimizer {
nesterov_(n) {
if (momentum_ != 0.0) {
size_t size = parameter->size();
// TODO: fix it with align aware allocator bind to Tensor
momentums_ = new Tensor(size);
}
}