unresponsive agents over time OpenStack #96

Closed · rkoster opened this issue Mar 25, 2013 · 30 comments

rkoster (Contributor) commented Mar 25, 2013

I have deployed a micro bosh on OpenStack, and with this micro bosh a normal bosh has been deployed. I had been deploying other things with this normal bosh on Friday. Over the weekend, however, some of the instances in the bosh deployment became unresponsive. I get the following from bosh vms while targeting the micro bosh.

+------------------+--------------------+---------------+-------------+
| Job/index        | State              | Resource Pool | IPs         |
+------------------+--------------------+---------------+-------------+
| unknown/unknown  | unresponsive agent |               |             |
| unknown/unknown  | unresponsive agent |               |             |
| unknown/unknown  | unresponsive agent |               |             |
| unknown/unknown  | unresponsive agent |               |             |
| blobstore/0      | running            | small         | 10.200.7.9  |
| health_monitor/0 | running            | small         | 10.200.7.7  |
| postgres/0       | running            | small         | 10.200.7.12 |
| redis/0          | running            | small         | 10.200.7.10 |
+------------------+--------------------+---------------+-------------+

I'm currently running Version 1.5.0.pre2 (release:346bb97d bosh:346bb97d), which was built from master one week ago. I created a micro-bosh stemcell and a normal stemcell from this same commit.

oppegard (Member) commented:

Are you still having this problem with OpenStack stemcells? If so, have you tried building a newer one that may have addressed the issue?

rkoster (Contributor, Author) commented May 30, 2013

I'm currently using the ones from CI.

+---------------+---------+--------------------------------------+
| Name          | Version | CID                                  |
+---------------+---------+--------------------------------------+
| bosh-stemcell | 661     | 80eaa2bb-ef8e-46ae-99b1-c7332e549453 |
+---------------+---------+--------------------------------------+

But I still have the problem from time to time. I did find a workaround, however: running monit stop registry && monit start registry on the microbosh.

I currently have a microbosh deployed, so what logging would be helpful for debugging this problem further?

oppegard (Member) commented:

How recent is the microbosh code? ('bosh status' should show the git SHA.) Do the registry logs under /var/vcap/sys/log/registry show anything of interest?

@frodenas have you seen behavior like this on OpenStack?

rkoster (Contributor, Author) commented May 30, 2013

I'm currently running Version 1.5.0.pre.661 (release:2a3c861a bosh:2a3c861a).
Will get back with the logs.

rkoster (Contributor, Author) commented May 31, 2013

I found the following stack trace in /var/vcap/sys/log/registry/registry.stderr.log:

Excon::Errors::Unauthorized - Expected([200, 204]) <=> Actual(401 Unauthorized):
        /var/vcap/packages/registry/gem_home/gems/excon-0.22.1/lib/excon/middlewares/expects.rb:10:in `response_call'
        /var/vcap/packages/registry/gem_home/gems/excon-0.22.1/lib/excon/connection.rb:355:in `response'
        /var/vcap/packages/registry/gem_home/gems/excon-0.22.1/lib/excon/connection.rb:249:in `request'
        /var/vcap/packages/registry/gem_home/gems/fog-1.11.1/lib/fog/core/connection.rb:21:in `request'
        /var/vcap/packages/registry/gem_home/gems/fog-1.11.1/lib/fog/openstack.rb:194:in `retrieve_tokens_v2'
        /var/vcap/packages/registry/gem_home/gems/fog-1.11.1/lib/fog/openstack.rb:87:in `authenticate_v2'
        /var/vcap/packages/registry/gem_home/gems/fog-1.11.1/lib/fog/openstack/compute.rb:387:in `authenticate'
        /var/vcap/packages/registry/gem_home/gems/fog-1.11.1/lib/fog/openstack/compute.rb:347:in `rescue in request'
        /var/vcap/packages/registry/gem_home/gems/fog-1.11.1/lib/fog/openstack/compute.rb:333:in `request'
        /var/vcap/packages/registry/gem_home/gems/fog-1.11.1/lib/fog/openstack/requests/compute/list_servers_detail.rb:15:in `list_servers_detail'
        /var/vcap/packages/registry/gem_home/gems/fog-1.11.1/lib/fog/openstack/models/compute/servers.rb:21:in `all'
        /var/vcap/packages/registry/gem_home/gems/fog-1.11.1/lib/fog/core/collection.rb:141:in `lazy_load'
        /var/vcap/packages/registry/gem_home/gems/fog-1.11.1/lib/fog/core/collection.rb:22:in `each'
        /var/vcap/packages/registry/gem_home/gems/bosh_registry-1.5.0.pre.661/lib/bosh_registry/instance_manager/openstack.rb:45:in `find'
        /var/vcap/packages/registry/gem_home/gems/bosh_registry-1.5.0.pre.661/lib/bosh_registry/instance_manager/openstack.rb:45:in `instance_ips'
        /var/vcap/packages/registry/gem_home/gems/bosh_registry-1.5.0.pre.661/lib/bosh_registry/instance_manager.rb:45:in `check_instance_ips'
        /var/vcap/packages/registry/gem_home/gems/bosh_registry-1.5.0.pre.661/lib/bosh_registry/instance_manager.rb:29:in `read_settings'
        /var/vcap/packages/registry/gem_home/gems/bosh_registry-1.5.0.pre.661/lib/bosh_registry/api_controller.rb:22:in `block in <class:ApiController>'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1415:in `call'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1415:in `block in compile!'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:944:in `[]'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:944:in `block (3 levels) in route!'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:960:in `route_eval'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:944:in `block (2 levels) in route!'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:981:in `block in process_route'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:979:in `catch'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:979:in `process_route'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:943:in `block in route!'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:942:in `each'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:942:in `route!'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1053:in `block in dispatch!'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1035:in `block in invoke'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1035:in `catch'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1035:in `invoke'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1050:in `dispatch!'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:878:in `block in call!'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1035:in `block in invoke'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1035:in `catch'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1035:in `invoke'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:878:in `call!'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:864:in `call'
        /var/vcap/packages/registry/gem_home/gems/rack-protection-1.5.0/lib/rack/protection/xss_header.rb:18:in `call'
        /var/vcap/packages/registry/gem_home/gems/rack-protection-1.5.0/lib/rack/protection/path_traversal.rb:16:in `call'
        /var/vcap/packages/registry/gem_home/gems/rack-protection-1.5.0/lib/rack/protection/json_csrf.rb:18:in `call'
        /var/vcap/packages/registry/gem_home/gems/rack-protection-1.5.0/lib/rack/protection/base.rb:49:in `call'
        /var/vcap/packages/registry/gem_home/gems/rack-protection-1.5.0/lib/rack/protection/base.rb:49:in `call'
        /var/vcap/packages/registry/gem_home/gems/rack-protection-1.5.0/lib/rack/protection/frame_options.rb:31:in `call'
        /var/vcap/packages/registry/gem_home/gems/rack-1.5.2/lib/rack/nulllogger.rb:9:in `call'
        /var/vcap/packages/registry/gem_home/gems/rack-1.5.2/lib/rack/head.rb:11:in `call'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/showexceptions.rb:21:in `call'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:172:in `call'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1947:in `call'
        /var/vcap/packages/registry/gem_home/gems/rack-1.5.2/lib/rack/builder.rb:138:in `call'
        /var/vcap/packages/registry/gem_home/gems/rack-1.5.2/lib/rack/urlmap.rb:65:in `block in call'
        /var/vcap/packages/registry/gem_home/gems/rack-1.5.2/lib/rack/urlmap.rb:50:in `each'
        /var/vcap/packages/registry/gem_home/gems/rack-1.5.2/lib/rack/urlmap.rb:50:in `call'
        /var/vcap/packages/registry/gem_home/gems/thin-1.5.1/lib/thin/connection.rb:81:in `block in pre_process'
        /var/vcap/packages/registry/gem_home/gems/thin-1.5.1/lib/thin/connection.rb:79:in `catch'
        /var/vcap/packages/registry/gem_home/gems/thin-1.5.1/lib/thin/connection.rb:79:in `pre_process'
        /var/vcap/packages/registry/gem_home/gems/thin-1.5.1/lib/thin/connection.rb:54:in `process'
        /var/vcap/packages/registry/gem_home/gems/thin-1.5.1/lib/thin/connection.rb:39:in `receive_data'
        /var/vcap/packages/registry/gem_home/gems/eventmachine-0.12.10/lib/eventmachine.rb:256:in `run_machine'
        /var/vcap/packages/registry/gem_home/gems/eventmachine-0.12.10/lib/eventmachine.rb:256:in `run'
        /var/vcap/packages/registry/gem_home/gems/thin-1.5.1/lib/thin/backends/base.rb:63:in `start'
        /var/vcap/packages/registry/gem_home/gems/thin-1.5.1/lib/thin/server.rb:159:in `start'
        /var/vcap/packages/registry/gem_home/gems/bosh_registry-1.5.0.pre.661/lib/bosh_registry/runner.rb:34:in `start_http_server'
        /var/vcap/packages/registry/gem_home/gems/bosh_registry-1.5.0.pre.661/lib/bosh_registry/runner.rb:18:in `run'
        /var/vcap/packages/registry/gem_home/gems/bosh_registry-1.5.0.pre.661/bin/bosh_registry:28:in `<top (required)>'
        /var/vcap/packages/registry/bin/bosh_registry:23:in `load'
        /var/vcap/packages/registry/bin/bosh_registry:23:in `<main>'

rkoster (Contributor, Author) commented May 31, 2013

I encountered the problem again today. It happened when I added the echo service to my deployment. For this, packages needed to be compiled and, as a consequence, new VMs needed to be created. The creation process itself went fine (the VMs were created, as seen in Horizon), but then the compilation did not start.
While the bosh deploy task was still waiting on the compilation VMs, applying the fix above (a registry restart) made the deployment continue.

frodenas (Contributor) commented:

It seems the registry is losing its connection to OpenStack; it tries to reauthenticate, but that fails.

frodenas (Contributor) commented:

There's a bug in the fog gem. Once a user token has expired, it doesn't reauthenticate properly: it keeps using the same expired token instead of requesting a new one.
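
To illustrate the failure mode, here is a minimal sketch of the retry-with-fresh-token pattern the registry needs. This is not the actual fog code or the diff in the fix; OpenStackClient, authenticate, and perform are made-up names, while Excon::Errors::Unauthorized is the exception class seen in the trace above.

    require "excon"   # provides Excon::Errors::Unauthorized, the error from the trace

    # Illustrative sketch only: on a 401, drop the cached token, authenticate
    # again, and retry, instead of retrying with the expired token.
    class OpenStackClient
      def initialize(credentials)
        @credentials = credentials
        @auth_token  = nil
      end

      def request(params)
        @auth_token ||= authenticate        # lazily obtain a token on first use
        perform(params, @auth_token)
      rescue Excon::Errors::Unauthorized
        @auth_token = authenticate          # token expired: fetch a fresh one
        perform(params, @auth_token)        # retry once with the new token
      end

      private

      # Placeholder standing in for the real Keystone authentication call.
      def authenticate
        "fresh-token"
      end

      # Placeholder standing in for the real OpenStack API request.
      def perform(params, token)
        { params: params, token: token }
      end
    end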

abic (Contributor) commented Jun 3, 2013

@frodenas: is there an issue and/or pull request upstream for this?

frodenas (Contributor) commented Jun 3, 2013

PR #235 - Issue #50935167

rkoster (Contributor, Author) commented Jun 3, 2013

Nice

drnic (Contributor) commented Jun 4, 2013

Today I had a compilation VM that couldn't authenticate with the registry. Hopefully related, and hopefully fixed.

Has a new microbosh come out since #235 was merged?

drnic (Contributor) commented Jun 4, 2013

Having said the above, my compilation VM's user_data.json does not contain user:pass for the registry:

{"registry":{"endpoint":"http://10.0.0.2:25777"}, ...

drnic (Contributor) commented Jun 4, 2013

I deployed the wordpress example a few days ago, but tonight it's failing as above on a new deployment.

drnic (Contributor) commented Jun 4, 2013

How do you kill a deployment when the compilation VM is hanging due to agent issues? Waiting for the timeout takes forever when you know it's just a glitch.

drnic (Contributor) commented Jun 4, 2013

Upgrading from 676 to 693 to see if it fixes the issue.

frodenas (Contributor) commented Jun 4, 2013

@drnic, the 698 stemcell doesn't contain PR #235; it'll be in stemcells >= 704 (not yet published). The user-data usually doesn't contain the user/password for the registry (except on the microbosh VM). The bosh_registry implements a security mechanism when reading settings: it checks that the IP of the VM asking for settings matches the IP recorded in the requested settings. It would be useful to see the VM logs to check exactly what's happening in your case.

Regarding cancelling a compilation VM: it's currently not possible. We have a story in our backlog to deal with this.
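
For context, that IP check is what triggers the OpenStack API call that fails in the stack trace above (read_settings -> check_instance_ips -> instance_ips -> list_servers_detail). Here is a rough sketch of the idea, not the actual bosh_registry source; InstanceManagerSketch and its collaborators are made-up names.

    # Rough sketch of the registry's settings IP check (illustrative names only).
    class InstanceManagerSketch
      def initialize(cloud, settings_store)
        @cloud = cloud                     # e.g. a fog OpenStack compute client
        @settings_store = settings_store   # e.g. the registry database
      end

      def read_settings(instance_id, remote_ip)
        # Ask the IaaS which IPs belong to this instance; on OpenStack this stands
        # in for the fog call that raises 401 once the Keystone token expires.
        ips = @cloud.instance_ips(instance_id)

        unless ips.include?(remote_ip)
          raise "Settings for #{instance_id} can only be requested from the VM itself"
        end

        @settings_store.fetch(instance_id)
      end
    end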

frodenas (Contributor) commented Jun 4, 2013

Correction: The patch is included in stemcell >= 703 (it has been published just a few minutes ago).

@rkoster Can you please try the latest stemcell?

rkoster (Contributor, Author) commented Jun 4, 2013

I have deployed 703, but the director has some problems while starting.

cat /var/vcap/sys/log/director/migrate.stderr.log

/var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:208:in `initialize': PG::Error: FATAL:  role "bosh" does not exist (Sequel::DatabaseConnectionError)
        from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:208:in `new'
        from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:208:in `connect'
        from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/connection_pool.rb:94:in `make_new'
        from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/connection_pool/threaded.rb:164:in `make_new'
        from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/connection_pool/threaded.rb:137:in `available'
        from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/connection_pool/threaded.rb:127:in `block in acquire'
        from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/connection_pool/threaded.rb:195:in `block in sync'
        from <internal:prelude>:10:in `synchronize'

rkoster (Contributor, Author) commented Jun 4, 2013

Seems like the default bosh postgres user has been changed.
I already tried:

/var/vcap/packages/postgres/bin/psql -d bosh -U vcap -c "create role \"bosh\" NOSUPERUSER LOGIN INHERIT CREATEDB"
/var/vcap/packages/postgres/bin/psql -d bosh -U vcap -c "alter role \"bosh\" with password 'bosh'"

tail /var/vcap/sys/log/director/migrate.stderr.log now gives:

/var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:145:in `exec': PG::Error: ERROR:  relation "schema_migrations" already exists (Sequel::DatabaseError)
        from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:145:in `block in execute_query'
        from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/database/logging.rb:33:in `log_yield'
        from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:145:in `execute_query'
        from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:132:in `block in execute'
        from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:111:in `check_disconnect_errors'
        from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:132:in `execute'
        from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:413:in `_execute'
        from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:242:in `block (2 levels) in execute'
        from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:425:in `check_database_errors'
        from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:242:in `block in execute'
        from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/database/connecting.rb:236:in `block in synchronize'

rkoster (Contributor, Author) commented Jun 4, 2013

Fixed the problem of the failed postgres migration with:
/var/vcap/packages/postgres/bin/psql -d bosh -U vcap -c "REASSIGN OWNED BY postgres TO bosh"

Now have successfully deployed microbosh 703.

If the problem still persists, it should manifest itself within one day.
Will report back tomorrow.
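
For anyone else hitting the same failed migration after an upgrade, the complete workaround from the comments above, run on the microbosh, is shown below. The commands are quoted verbatim from this thread; the role name and password simply mirror this example, so adjust them to match your own director configuration.

    # recreate the database role the new director expects
    /var/vcap/packages/postgres/bin/psql -d bosh -U vcap -c "create role \"bosh\" NOSUPERUSER LOGIN INHERIT CREATEDB"
    /var/vcap/packages/postgres/bin/psql -d bosh -U vcap -c "alter role \"bosh\" with password 'bosh'"
    # hand over ownership of the existing objects so the schema migrations can run
    /var/vcap/packages/postgres/bin/psql -d bosh -U vcap -c "REASSIGN OWNED BY postgres TO bosh"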

drnic (Contributor) commented Jun 4, 2013

Can you create a ticket to add this as a migration?

drnic (Contributor) commented Jun 4, 2013

Perhaps it's not actually possible to migrate. So do we need to add properties to the legacy micro_bosh.yml?

pmenglund (Contributor) commented:

The regression happened because our CI system unfortunately doesn't test upgrades, just clean installs. Sorry about that; I'll make sure someone looks at fixing it.

drnic (Contributor) commented Jun 4, 2013

Perhaps a ticket/feature to test n-1 -> n upgrades?

gabis (Contributor) commented Jun 4, 2013

Both of those stories exist. We have the failed-update bug at the top of the backlog, so it will get picked up next. We also have CI upgrade stories for each platform, for both micro and full bosh, which are prioritized in the backlog. CI improvements are the current focus of the team, so we anticipate coverage for these cases within a few weeks.

drnic (Contributor) commented Jun 4, 2013

xoxo to the ci team!

frodenas (Contributor) commented Jun 6, 2013

@rkoster Did your agents lose the connection again?

rkoster (Contributor, Author) commented Jun 7, 2013

The problem did not reappear. I have tried increasing the size of the deployment, and the machines were added without problems. I also no longer see the connection problem in the registry log.

I only see the following stack trace, which does not seem to cause problems. If this stack trace is not expected, I will create a new issue for it.

Bosh::Registry::InstanceNotFound - Can't find instance `vm-bedb1b6d-1e0e-4c2a-96fb-f70eb97d5093':
        /var/vcap/packages/registry/gem_home/gems/bosh_registry-1.5.0.pre.703/lib/bosh_registry/instance_manager.rb:57:in `get_instance'
        /var/vcap/packages/registry/gem_home/gems/bosh_registry-1.5.0.pre.703/lib/bosh_registry/instance_manager.rb:31:in `read_settings'
        /var/vcap/packages/registry/gem_home/gems/bosh_registry-1.5.0.pre.703/lib/bosh_registry/api_controller.rb:22:in `block in <class:ApiController>'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1415:in `call'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1415:in `block in compile!'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:944:in `[]'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:944:in `block (3 levels) in route!'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:960:in `route_eval'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:944:in `block (2 levels) in route!'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:981:in `block in process_route'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:979:in `catch'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:979:in `process_route'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:943:in `block in route!'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:942:in `each'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:942:in `route!'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1053:in `block in dispatch!'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1035:in `block in invoke'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1035:in `catch'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1035:in `invoke'
        /var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1050:in `dispatch!'

rkoster closed this as completed Jun 7, 2013

frodenas (Contributor) commented Jun 7, 2013

Thanks @rkoster for reporting back! Reopen the issue if the bug appears again.
