
hpx::cout output lost #1173

Closed
LuisAyuso opened this issue Jul 2, 2014 · 11 comments
Comments

@LuisAyuso

When utilising hpx::cout and running on two localities, the console output produced by hpx::cout is lost.

Here is a small snippet:

#include <vector>

#include <hpx/hpx_init.hpp>
#include <hpx/include/actions.hpp>
#include <hpx/include/iostreams.hpp>
#include <hpx/include/lcos.hpp>

typedef hpx::unique_future<void> wait_for_worker;
wait_for_worker worker()
{
    hpx::cout << "hello!" << hpx::endl;
    return hpx::make_ready_future();
}
HPX_PLAIN_ACTION(worker, worker_action);

int hpx_main(boost::program_options::variables_map& vm)
{
    std::vector<wait_for_worker> futures;

    // get locations and start workers
    for(auto l : hpx::find_all_localities()){
        futures.push_back(hpx::async(worker_action(),l));
    }
    hpx::wait_all(futures);

    return hpx::finalize();
}

int main(int argc, char* argv[])
{
    // Initialize HPX, run hpx_main as the first HPX thread, and
    // wait for hpx::finalize being called.
    return hpx::init(argc, argv);
}

Running on a single locality produces the right output; running on two localities (different physical machines) produces no output.

The funny thing is that introducing the following statement in the worker action generates the appropriate output (plus one extra line per locality):

    std::cout << "std::cout print" << std::endl;

The compilation was done with the following command:

/usr/bin/c++    -g -I<HPX_PREFIX>/include -I<HPX_PREFIX>/include/hpx/external  \
     -std=c++0x -o test1.cpp.o -c test1.cpp

@Syntaf (Member) commented Jul 2, 2014

Have you tried using hpx::flush? e.g.

hpx::cout << "Hello world\n" << hpx::flush;

@hkaiser (Member) commented Jul 3, 2014

Apparently this was already fixed since the latest release (the one you're using). I added a test verifying that it is fixed (based on the code you posted above). I'm closing this now, as I'm not able to reproduce it with the top of master. Please feel free to reopen if needed.

@LuisAyuso (Author)

I am still experiencing the problem, even when including hpx::flush.
Yes, I do use the 9.8 version, which I believe is the latest release, the one provided by the download link.
I found out that the output is lost only in some executions. It might be a conceptual problem:

Is hpx::flush a blocking operation? I mean, does the program execution wait for the completion of the underlying network transmission? Otherwise the future is made ready for the calling action, and the process could end before the stream is transmitted.

Or: could it be that the transmission of the standard stream is not guaranteed (e.g. by using UDP or similar)? In that case I could be losing packets, and the "head" locality might not notice.

Thanks

@sithhell (Member) commented Jul 3, 2014

Luis, could you try the last commit on the master branch? I think this was a known problem that existed in that particular release. Reopening for now. Please report about success/failure on the master branch.

sithhell reopened this Jul 3, 2014

@hkaiser (Member) commented Jul 3, 2014

> Yes, I do use the 9.8 version, which I believe is the latest release, the one provided by the download link.

This seems to be a bug which was pertinent to that particular release and it looks like it has been fixed since then.

> Is hpx::flush a blocking operation? I mean, does the program execution wait for the completion of the underlying network transmission?

hpx::flush asynchronously sends off the data and returns immediately. Using this should work as a workaround for now.

@LuisAyuso (Author)

Hi again,
The git version does a better job with the streaming: every successful execution produces the right output. The problem with the 9.9-trunk version is that only about one in every three executions works.
This is the new code:

#include <vector>

#include <hpx/hpx_init.hpp>
#include <hpx/include/lcos.hpp>
#include <hpx/include/actions.hpp>
#include <hpx/include/iostreams.hpp>

typedef hpx::lcos::future<int> wait_for_worker;
wait_for_worker worker( )
{
    hpx::cout << " hello" << hpx::endl;
    return hpx::make_ready_future(1);
}
HPX_PLAIN_ACTION(worker, worker_action);

int hpx_main(boost::program_options::variables_map& vm)
{
    // get locations and start workers
    std::vector<wait_for_worker> futures;
    for(auto l : hpx::find_all_localities()){
        futures.push_back(hpx::async<worker_action>(l));
    }
    hpx::wait_all(futures);   
    return hpx::finalize();
}

int main(int argc, char* argv[])
{
    return hpx::init(argc, argv);
}

I ran it this way:

remote: ./main -a ip0 -x ip0:7910 -l 2 
local:     ./main -a ip0:7910 -x ip1:7910 -w 

Most of the runs fail on the instance specified to wait for localities (the remote one) in this manner:

[...]/include/hpx/util/bind.hpp:270: void hpx::util::detail::one_shot_wrapper<F>::check_call() [with F = hpx::actions::action<hpx::lcos::base_lco, hpx::util::unused_type, hpx::util::tuple<>, hpx::actions::direct_action0<void (hpx::lcos::base_lco::*)(), &hpx::lcos::base_lco::set_event_nonvirt, hpx::actions::detail::this_type> >::continuation_thread_object_function_void_0]: Assertion `!_called' failed.

I guess this problem might be addressed by some other ticket; I think I will just use 9.8 without streams until 9.9 becomes stable.

Thanks for your time anyway.

@K-ballo (Member) commented Jul 4, 2014

Could you provide a full log for that assertion?

@hkaiser (Member) commented Jul 4, 2014

I'm not able to reproduce this. Which compiler do you use? Which cmake/compiler command line? Can you try running the regression test regression/iostreams/no_output_1173 - does that work?

hkaiser reopened this Jul 4, 2014

@sithhell (Member)

Is this still a problem?

@hkaiser (Member) commented Sep 12, 2014

This is related to #1229

@hkaiser (Member) commented Oct 25, 2014

This should be fixed now, as #1229 has been closed. Please reopen if the problem persists.

hkaiser closed this as completed Oct 25, 2014