Comments (30)
Are you still having this problem with OpenStack stemcells? If so, have you tried building a newer one that may have addressed the issue?
from bosh.
I'm currently using the ones from CI.
+---------------+---------+--------------------------------------+
| Name | Version | CID |
+---------------+---------+--------------------------------------+
| bosh-stemcell | 661 | 80eaa2bb-ef8e-46ae-99b1-c7332e549453 |
+---------------+---------+--------------------------------------+
But I still have the problem from time to time. I found a workaround, however: running monit stop registry && monit start registry
on the microbosh.
I currently have a microbosh deployed, so what logging would be helpful to further debug this problem?
How recent is the microbosh code? ('bosh status' should show the git SHA.) Do the registry logs under /var/vcap/sys/log/registry show anything of interest?
@frodenas have you seen behavior like this on openstack?
I'm currently running Version 1.5.0.pre.661 (release:2a3c861a bosh:2a3c861a)
Will get back on the logs.
I found the following stack trace in /var/vcap/sys/log/registry/registry.stderr.log:
Excon::Errors::Unauthorized - Expected([200, 204]) <=> Actual(401 Unauthorized):
/var/vcap/packages/registry/gem_home/gems/excon-0.22.1/lib/excon/middlewares/expects.rb:10:in `response_call'
/var/vcap/packages/registry/gem_home/gems/excon-0.22.1/lib/excon/connection.rb:355:in `response'
/var/vcap/packages/registry/gem_home/gems/excon-0.22.1/lib/excon/connection.rb:249:in `request'
/var/vcap/packages/registry/gem_home/gems/fog-1.11.1/lib/fog/core/connection.rb:21:in `request'
/var/vcap/packages/registry/gem_home/gems/fog-1.11.1/lib/fog/openstack.rb:194:in `retrieve_tokens_v2'
/var/vcap/packages/registry/gem_home/gems/fog-1.11.1/lib/fog/openstack.rb:87:in `authenticate_v2'
/var/vcap/packages/registry/gem_home/gems/fog-1.11.1/lib/fog/openstack/compute.rb:387:in `authenticate'
/var/vcap/packages/registry/gem_home/gems/fog-1.11.1/lib/fog/openstack/compute.rb:347:in `rescue in request'
/var/vcap/packages/registry/gem_home/gems/fog-1.11.1/lib/fog/openstack/compute.rb:333:in `request'
/var/vcap/packages/registry/gem_home/gems/fog-1.11.1/lib/fog/openstack/requests/compute/list_servers_detail.rb:15:in `list_servers_detail'
/var/vcap/packages/registry/gem_home/gems/fog-1.11.1/lib/fog/openstack/models/compute/servers.rb:21:in `all'
/var/vcap/packages/registry/gem_home/gems/fog-1.11.1/lib/fog/core/collection.rb:141:in `lazy_load'
/var/vcap/packages/registry/gem_home/gems/fog-1.11.1/lib/fog/core/collection.rb:22:in `each'
/var/vcap/packages/registry/gem_home/gems/bosh_registry-1.5.0.pre.661/lib/bosh_registry/instance_manager/openstack.rb:45:in `find'
/var/vcap/packages/registry/gem_home/gems/bosh_registry-1.5.0.pre.661/lib/bosh_registry/instance_manager/openstack.rb:45:in `instance_ips'
/var/vcap/packages/registry/gem_home/gems/bosh_registry-1.5.0.pre.661/lib/bosh_registry/instance_manager.rb:45:in `check_instance_ips'
/var/vcap/packages/registry/gem_home/gems/bosh_registry-1.5.0.pre.661/lib/bosh_registry/instance_manager.rb:29:in `read_settings'
/var/vcap/packages/registry/gem_home/gems/bosh_registry-1.5.0.pre.661/lib/bosh_registry/api_controller.rb:22:in `block in <class:ApiController>'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1415:in `call'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1415:in `block in compile!'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:944:in `[]'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:944:in `block (3 levels) in route!'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:960:in `route_eval'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:944:in `block (2 levels) in route!'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:981:in `block in process_route'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:979:in `catch'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:979:in `process_route'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:943:in `block in route!'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:942:in `each'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:942:in `route!'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1053:in `block in dispatch!'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1035:in `block in invoke'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1035:in `catch'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1035:in `invoke'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1050:in `dispatch!'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:878:in `block in call!'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1035:in `block in invoke'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1035:in `catch'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1035:in `invoke'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:878:in `call!'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:864:in `call'
/var/vcap/packages/registry/gem_home/gems/rack-protection-1.5.0/lib/rack/protection/xss_header.rb:18:in `call'
/var/vcap/packages/registry/gem_home/gems/rack-protection-1.5.0/lib/rack/protection/path_traversal.rb:16:in `call'
/var/vcap/packages/registry/gem_home/gems/rack-protection-1.5.0/lib/rack/protection/json_csrf.rb:18:in `call'
/var/vcap/packages/registry/gem_home/gems/rack-protection-1.5.0/lib/rack/protection/base.rb:49:in `call'
/var/vcap/packages/registry/gem_home/gems/rack-protection-1.5.0/lib/rack/protection/base.rb:49:in `call'
/var/vcap/packages/registry/gem_home/gems/rack-protection-1.5.0/lib/rack/protection/frame_options.rb:31:in `call'
/var/vcap/packages/registry/gem_home/gems/rack-1.5.2/lib/rack/nulllogger.rb:9:in `call'
/var/vcap/packages/registry/gem_home/gems/rack-1.5.2/lib/rack/head.rb:11:in `call'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/showexceptions.rb:21:in `call'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:172:in `call'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1947:in `call'
/var/vcap/packages/registry/gem_home/gems/rack-1.5.2/lib/rack/builder.rb:138:in `call'
/var/vcap/packages/registry/gem_home/gems/rack-1.5.2/lib/rack/urlmap.rb:65:in `block in call'
/var/vcap/packages/registry/gem_home/gems/rack-1.5.2/lib/rack/urlmap.rb:50:in `each'
/var/vcap/packages/registry/gem_home/gems/rack-1.5.2/lib/rack/urlmap.rb:50:in `call'
/var/vcap/packages/registry/gem_home/gems/thin-1.5.1/lib/thin/connection.rb:81:in `block in pre_process'
/var/vcap/packages/registry/gem_home/gems/thin-1.5.1/lib/thin/connection.rb:79:in `catch'
/var/vcap/packages/registry/gem_home/gems/thin-1.5.1/lib/thin/connection.rb:79:in `pre_process'
/var/vcap/packages/registry/gem_home/gems/thin-1.5.1/lib/thin/connection.rb:54:in `process'
/var/vcap/packages/registry/gem_home/gems/thin-1.5.1/lib/thin/connection.rb:39:in `receive_data'
/var/vcap/packages/registry/gem_home/gems/eventmachine-0.12.10/lib/eventmachine.rb:256:in `run_machine'
/var/vcap/packages/registry/gem_home/gems/eventmachine-0.12.10/lib/eventmachine.rb:256:in `run'
/var/vcap/packages/registry/gem_home/gems/thin-1.5.1/lib/thin/backends/base.rb:63:in `start'
/var/vcap/packages/registry/gem_home/gems/thin-1.5.1/lib/thin/server.rb:159:in `start'
/var/vcap/packages/registry/gem_home/gems/bosh_registry-1.5.0.pre.661/lib/bosh_registry/runner.rb:34:in `start_http_server'
/var/vcap/packages/registry/gem_home/gems/bosh_registry-1.5.0.pre.661/lib/bosh_registry/runner.rb:18:in `run'
/var/vcap/packages/registry/gem_home/gems/bosh_registry-1.5.0.pre.661/bin/bosh_registry:28:in `<top (required)>'
/var/vcap/packages/registry/bin/bosh_registry:23:in `load'
/var/vcap/packages/registry/bin/bosh_registry:23:in `<main>'
I encountered the problem again today. It happened when I added the echo service to my deployment. For this, packages needed to be compiled, and as a consequence new VMs needed to be created. The creation process went fine (the VMs were created, according to Horizon), but compilation did not start.
While the bosh deploy task was still waiting on the compilation VMs, applying the above fix (a registry restart) let the deployment continue.
It seems the registry is losing its connection to OpenStack; it tries to reauthenticate but fails.
There's a bug in the fog gem: once a user token has expired, it doesn't reauthenticate, because it keeps using the same token instead of asking for a new one.
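The pattern behind the bug can be sketched as follows. This is a hypothetical illustration, not the actual fog internals (TokenClient, Unauthorized, and the callables are made up for this example): a client caches an auth token, but on a 401 it must discard the cached token and fetch a fresh one rather than retry with the expired token.

```ruby
# Illustrative sketch of the fix for the behaviour described above;
# none of these names come from fog itself.

class Unauthorized < StandardError; end

class TokenClient
  def initialize(authenticator)
    @authenticator = authenticator # callable returning a fresh token
    @token = nil
  end

  def request(api)
    @token ||= @authenticator.call
    begin
      api.call(@token)
    rescue Unauthorized
      @token = @authenticator.call # drop the stale token, re-authenticate
      api.call(@token)
    end
  end
end
```

With this shape, an expired token costs one retry instead of a permanent 401 loop; the monit restart workaround above helps for the same reason, since restarting the registry throws the cached token away.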
@frodenas: is there an issue and/or pull request upstream for this?
Nice
Today I had a compilation VM that couldn't authenticate with the registry. Hopefully related and hopefully fixed.
Has a new microbosh come out since #235 was merged?
Having said the above, my compilation VM's user_data.json
does not contain user:pass for the registry:
{"registry":{"endpoint":"http://10.0.0.2:25777"}, ...
I deployed the wordpress example a few days ago, but tonight it's failing as above on a new deployment.
How do you kill a deployment when the compilation VM is hanging due to agent issues? The timeout takes forever when you know it's a glitch.
Upgrading from 676 to 693 to see if it fixes issue.
@drnic, the 698 stemcell doesn't contain the #235 PR; it'll be in stemcell >=704 (not yet published). The user-data usually doesn't contain the user/pwd for the registry (except on the microbosh VM). When reading settings, the bosh_registry implements a security mechanism that checks that the IP of the VM asking for settings matches the IP recorded in the requested settings. It would be useful to see the VM logs to check exactly what's happening in your case.
Regarding cancelling a compilation VM, that's actually not possible. We have a story in our backlog to deal with this issue.
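The IP check described above can be sketched like this. The names are illustrative, not the actual bosh_registry code: the registry only hands an instance its settings if the caller's source IP is one of the IPs the IaaS reports for that instance, which is also why the Unauthorized error from OpenStack earlier in the thread broke settings reads.

```ruby
# Illustrative sketch of the registry's read-settings IP check; the
# class and method names are made up for this example.

class SettingsGuard
  # instance_ips: callable mapping an instance id to its list of IPs
  # (in the real registry this information comes from the IaaS API).
  def initialize(instance_ips, settings)
    @instance_ips = instance_ips
    @settings = settings
  end

  def read_settings(instance_id, requester_ip)
    ips = @instance_ips.call(instance_id)
    raise "Can't find instance `#{instance_id}'" if ips.nil?
    raise "Forbidden" unless ips.include?(requester_ip)
    @settings.fetch(instance_id)
  end
end
```

Under this scheme the user_data.json only needs the registry endpoint, not credentials, because the source-IP match is what authorizes the read.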
Correction: The patch is included in stemcell >= 703 (it was published just a few minutes ago).
@rkoster Can you please try the latest stemcell?
I have deployed 703 but the director has some problems while starting.
cat /var/vcap/sys/log/director/migrate.stderr.log
/var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:208:in `initialize': PG::Error: FATAL: role "bosh" does not exist (Sequel::DatabaseConnectionError)
from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:208:in `new'
from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:208:in `connect'
from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/connection_pool.rb:94:in `make_new'
from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/connection_pool/threaded.rb:164:in `make_new'
from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/connection_pool/threaded.rb:137:in `available'
from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/connection_pool/threaded.rb:127:in `block in acquire'
from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/connection_pool/threaded.rb:195:in `block in sync'
from <internal:prelude>:10:in `synchronize'
It seems the default bosh postgres user has been changed.
Already tried:
/var/vcap/packages/postgres/bin/psql -d bosh -U vcap -c "create role \"bosh\" NOSUPERUSER LOGIN INHERIT CREATEDB"
/var/vcap/packages/postgres/bin/psql -d bosh -U vcap -c "alter role \"bosh\" with password 'bosh'"
tail /var/vcap/sys/log/director/migrate.stderr.log
now gives:
/var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:145:in `exec': PG::Error: ERROR: relation "schema_migrations" already exists (Sequel::DatabaseError)
from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:145:in `block in execute_query'
from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/database/logging.rb:33:in `log_yield'
from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:145:in `execute_query'
from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:132:in `block in execute'
from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:111:in `check_disconnect_errors'
from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:132:in `execute'
from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:413:in `_execute'
from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:242:in `block (2 levels) in execute'
from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:425:in `check_database_errors'
from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/adapters/postgres.rb:242:in `block in execute'
from /var/vcap/packages/director/gem_home/gems/sequel-3.43.0/lib/sequel/database/connecting.rb:236:in `block in synchronize'
Fixed the problem of the failed postgres migration with:
/var/vcap/packages/postgres/bin/psql -d bosh -U vcap -c "REASSIGN OWNED BY postgres TO bosh"
Now have successfully deployed microbosh 703.
If the problem still persists, it should manifest itself within a day.
Will report back tomorrow.
Can you create a ticket to add this as a migration?
Dr Nic Williams
Stark & Wayne LLC - the consultancy for Cloud Foundry
http://starkandwayne.com
+1 415 860 2185
twitter: drnic
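A migration along the lines suggested here is only a sketch under assumptions: that the director keeps using Sequel (as the traces above show), and that the role names match the psql commands earlier in the thread. Whether REASSIGN OWNED is safe to run unconditionally on a fresh install is exactly the kind of question such a ticket would need to settle.

```ruby
# Hedged sketch, not an actual director migration: reassign objects
# created under the old default role to the new "bosh" role.
Sequel.migration do
  up do
    run "REASSIGN OWNED BY postgres TO bosh"
  end

  down do
    run "REASSIGN OWNED BY bosh TO postgres"
  end
end
```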
Perhaps it's actually not possible to migrate. So do we need to add properties to the legacy micro_bosh.yml?
The regression happened as our CI system unfortunately doesn't test upgrades, just clean installs. Sorry about that, I'll make sure someone looks at fixing it.
Perhaps a ticket/feature to test n-1 -> n upgrades?
Both of those stories exist. We have the failed update bug at the top of the backlog so it will get picked up next. We have CI upgrade stories for each platform for micro and full bosh which are also prioritized in the backlog. CI improvements are the current focus of the team, so we anticipate coverage for these cases within a few weeks.
xoxo to the ci team!
@rkoster Did your agents lose the connection again?
The problem did not reappear. I tried increasing the size of the deployment, and the machines were added without problems. I also no longer see the connection problem in the registry log.
I only see the following stack trace, which does not seem to cause problems. If this stack trace is not expected, I will create a new issue for it.
Bosh::Registry::InstanceNotFound - Can't find instance `vm-bedb1b6d-1e0e-4c2a-96fb-f70eb97d5093':
/var/vcap/packages/registry/gem_home/gems/bosh_registry-1.5.0.pre.703/lib/bosh_registry/instance_manager.rb:57:in `get_instance'
/var/vcap/packages/registry/gem_home/gems/bosh_registry-1.5.0.pre.703/lib/bosh_registry/instance_manager.rb:31:in `read_settings'
/var/vcap/packages/registry/gem_home/gems/bosh_registry-1.5.0.pre.703/lib/bosh_registry/api_controller.rb:22:in `block in <class:ApiController>'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1415:in `call'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1415:in `block in compile!'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:944:in `[]'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:944:in `block (3 levels) in route!'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:960:in `route_eval'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:944:in `block (2 levels) in route!'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:981:in `block in process_route'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:979:in `catch'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:979:in `process_route'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:943:in `block in route!'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:942:in `each'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:942:in `route!'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1053:in `block in dispatch!'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1035:in `block in invoke'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1035:in `catch'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1035:in `invoke'
/var/vcap/packages/registry/gem_home/gems/sinatra-1.4.2/lib/sinatra/base.rb:1050:in `dispatch!'
Thanks @rkoster for reporting back! Reopen the issue if the bug appears again.