
spark's People

Contributors

amarseillan, asolntsev, bentolor, dreambrother, edwardraff, ericksanval, fddayan, fdmoulin, giflw, heuermh, jakaarl, juliengaucher, kliakos, ljsztul, mikegolod, mirosta, mlitcher, mouette-sc, perwendel, reinhard, rodrigovz, sslavic, stevemcleod, t-kameyama, thebigb, tipsy, travisspencer, tweenietomatoes, vedala, zeroflag


spark's Issues

Java 8 plans

It would be nice to support Java 8 lambda expressions for defining route handlers. Is that possible?
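As a sketch of what this could look like (the single-method Route interface below is a stand-in to illustrate the lambda style, not Spark's actual API):

```java
// Hypothetical sketch only: this Route interface and get() method are
// assumptions used to illustrate the lambda style, not Spark's real API.
@FunctionalInterface
interface Route {
    Object handle(String request, String response);
}

class LambdaRouteSketch {
    // Stand-in for a framework "get" method that just invokes the handler.
    static Object get(String path, Route route) {
        return route.handle("request", "response");
    }

    public static void main(String[] args) {
        // With a functional interface, the anonymous-class boilerplate disappears:
        Object body = get("/hello", (req, res) -> "Hello World");
        System.out.println(body); // Hello World
    }
}
```

Any interface with exactly one abstract method qualifies as a lambda target, so supporting this may be as simple as keeping the handler interface single-method.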

static files support just like in jetty

Hi,
I spotted Spark yesterday and am already using it in a private project. Although I normally write the Jetty setup myself, I found that the way Spark handles requests is more elegant and saves me some time and lines of code, so thanks for that :D.

However, I haven't found a way to tell spark which directory can be used to store static files and how to access them.

Well, that's it :). It's probably easy to implement; I have done it with plain Jetty in all of my projects before: basically just define a new ResourceHandler, set its resourceBase, and add it via setHandlers on the Server object.
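For reference, the plain-Jetty setup described above looks roughly like this (a sketch against the Jetty 7/8 embedded API from memory; verify class names against your Jetty release):

```java
import org.eclipse.jetty.server.Handler;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.server.handler.DefaultHandler;
import org.eclipse.jetty.server.handler.HandlerList;
import org.eclipse.jetty.server.handler.ResourceHandler;

public class StaticFilesSketch {
    public static void main(String[] args) throws Exception {
        Server server = new Server(8080);

        // Serve files from a "public" directory, as described above.
        ResourceHandler resourceHandler = new ResourceHandler();
        resourceHandler.setResourceBase("public");

        // Chain the resource handler in front of the default handler.
        HandlerList handlers = new HandlerList();
        handlers.setHandlers(new Handler[] { resourceHandler, new DefaultHandler() });
        server.setHandler(handlers);

        server.start();
        server.join();
    }
}
```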

Hope this helps in some way :). Keep the project going, it's great. I even played with "play", but I don't like being jailed in any way. Spark jails me too, but far less; at least I feel free, and that's important.

Just keep it going :).

Make spark.Response's body method public

Currently the body method in spark.Response is package-private. This means that any application wishing to modify a response body using the before or after verb must be implemented in the spark package. This is annoying, especially for projects in existing package hierarchies that would like to migrate to Spark.

NoSuchMethodError: javax.servlet.http.HttpServletResponse.getHeader(String)

Hello,

When I update our build from version 1.0 to version 1.1 of spark-core, our functional tests fail with

java.lang.NoSuchMethodError: javax.servlet.http.HttpServletResponse.getHeader(Ljava/lang/String;)Ljava/lang/String;
        at spark.webserver.MatcherFilter.doFilter(MatcherFilter.java:205)
        at spark.servlet.SparkFilter.doFilter(SparkFilter.java:98)
        at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1332)
        at org.tuckey.web.filters.urlrewrite.RuleChain.handleRewrite(RuleChain.java:176)
        at org.tuckey.web.filters.urlrewrite.RuleChain.doRules(RuleChain.java:145)
        at org.tuckey.web.filters.urlrewrite.UrlRewriter.processRequest(UrlRewriter.java:92)
        at org.tuckey.web.filters.urlrewrite.UrlRewriteFilter.doFilter(UrlRewriteFilter.java:389)
        at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1332)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:477)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:119)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:499)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:227)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1031)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:406)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:186)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:965)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:117)
        at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:250)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:149)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:111)
        at org.eclipse.jetty.server.Server.handle(Server.java:348)
        at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:452)
        at org.eclipse.jetty.server.AbstractHttpConnection.content(AbstractHttpConnection.java:894)
        at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.content(AbstractHttpConnection.java:948)
        at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:851)
        at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:235)
        at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:77)
        at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:606)
        at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:46)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:603)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:538)
        at java.lang.Thread.run(Thread.java:722)

The functional tests run Jetty 7.x via cargo-maven2-plugin version 1.2.2, and we use version 2.5 of the servlet-api as a provided dependency:

<dependency>
  <groupId>javax.servlet</groupId>
  <artifactId>servlet-api</artifactId>
  <version>2.5</version>
  <scope>provided</scope>
</dependency>
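HttpServletResponse.getHeader(String) was only added in the Servlet 3.0 API, so a 2.5 servlet-api jar (and a Servlet 2.5 container such as Jetty 7) cannot satisfy it. One hedged fix, assuming you are able to move containers, is to depend on the 3.0 API and run under a container that implements it (e.g. Jetty 8+):

```xml
<!-- Assumption: upgrading both the API jar and the container to Servlet 3.0.
     Note the 3.0 API ships under a different artifactId than 2.5's servlet-api. -->
<dependency>
  <groupId>javax.servlet</groupId>
  <artifactId>javax.servlet-api</artifactId>
  <version>3.0.1</version>
  <scope>provided</scope>
</dependency>
```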

Restart Server Automatically

I'm wondering if it would be possible for me to restart the server automatically when I edit static files. I'm doing a lot of my development in Javascript right now, and it's annoying to have to stop and restart the server every time I make a small change to the Javascript or the HTML.

If there's a workaround or a fix for this, please tell me!

Thanks!

Consider using retrofit.converter and retrofit.mimetype packages

I have recently used retrofit

http://square.github.io/retrofit/

on a couple of client-side projects and like its APIs, in particular the converter

http://square.github.io/retrofit/javadoc/retrofit/converter/package-summary.html

and mimetype packages

http://square.github.io/retrofit/javadoc/retrofit/mime/package-summary.html

I wonder if it might be possible to reuse these on the server side in spark for aid in implementing the newly added transform and template features.

Stop/restart Jetty after running HelloWorld

I couldn't find any tips or answers on how to stop/restart the Jetty instance after running the simple HelloWorld class. Once started, there is no way to rerun the same class. I took the supplied HelloWorld example and tried to re-run it after some modification; here is what I got in the Eclipse console:

log4j:WARN No appenders could be found for logger (spark.route.RouteMatcherFactory).
log4j:WARN Please initialize the log4j system properly.
== Spark has ignited ...
>> Listening on 0.0.0.0:4567
java.net.BindException: Address already in use: JVM_Bind
    at java.net.PlainSocketImpl.socketBind(Native Method)
    at java.net.PlainSocketImpl.bind(PlainSocketImpl.java:365)
    at java.net.ServerSocket.bind(ServerSocket.java:319)
    at java.net.ServerSocket.<init>(ServerSocket.java:185)
    at java.net.ServerSocket.<init>(ServerSocket.java:141)
    at org.eclipse.jetty.server.bio.SocketConnector.newServerSocket(SocketConnector.java:86)
    at org.eclipse.jetty.server.bio.SocketConnector.open(SocketConnector.java:75)
    at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:356)
    at org.eclipse.jetty.server.bio.SocketConnector.doStart(SocketConnector.java:146)
    at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:55)
    at org.eclipse.jetty.server.Server.doStart(Server.java:269)
    at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:55)
    at spark.webserver.SparkServerImpl.ignite(SparkServerImpl.java:64)
    at spark.Spark$1.run(Spark.java:197)
    at java.lang.Thread.run(Thread.java:619)

There is no service running in Windows services list either. Any idea? Thanks

File upload support

Hi, thank you for your wonderful work. Is there any way to handle files being uploaded via POST?

Using staticFileLocation("/public") in a project (spark-core-0.9.9.7-SNAPSHOT) deployed on an external server does not work.

Version spark-core-0.9.9.7-SNAPSHOT
I was trying to use staticFileLocation("/public") in my code as follows:

public class MyClass implements spark.servlet.SparkApplication {

    @Override
    public void init() {
        staticFileLocation("/public");
        ........
        ........
    }
}

When I generate a war for the project and deploy it on Jetty, I get the following exception:
java.lang.IllegalStateException: This must be done before route mapping has begun
at spark.Spark.throwBeforeRouteMappingException(Spark.java:320)
at spark.Spark.staticFileLocation(Spark.java:144)
at com.itgssi.pmds.client.pam.MyClass.init(MyClass.java:27)
at spark.servlet.SparkFilter.init(SparkFilter.java:59)
at org.eclipse.jetty.servlet.FilterHolder.initialize(FilterHolder.java:135)
at org.eclipse.jetty.servlet.ServletHandler.initialize(ServletHandler.java:800)

This is happening because the SparkFilter I define in my web.xml as follows:

<filter-mapping>
    <filter-name>SparkFilter</filter-name>
    <url-pattern>/pam</url-pattern>
</filter-mapping>

is initializing the servlet without waiting for the SparkApplication to load.
Hence the servlet is initialised even before the class is loaded, causing the exception.
This needs to be resolved, as one cannot deploy a class using staticFileLocation on an external web server.

Handle binary result streams not only Strings

Since response.raw().getOutputStream() throws an exception in Spark,

it would be nice to be able to return an InputStream with data (e.g. from a PipedInputStream), which would then be passed through the filters and streamed to the client.

Custom 404 Page when using Static File Route?

When using staticFileRoute("/foo"), is there an easy way to configure a static, catch-all 404 error page? (Defining a get(new Route("*"){...}) at the end causes the static files to not be served.)

Creating result transformation modules mainly for RESTful architecture.

Hello,

I am using Spark as an educational resource for my video tutorials. I have seen that Spark supports RESTful URL formats, but it does not support transforming entities to, for example, JSON (though this can of course be done inside a Route's handle method). Because RESTful web architecture, and HATEOAS in particular, is really growing in popularity, I think it would be great if Spark supported this transformation natively. So I downloaded the Spark code to see what could be done. Before starting to code and creating a pull request, I would like to know whether there are philosophical reasons not to implement this natively. Let me explain my approach:

The basic idea is to create a META-INF service interface that is in charge of transforming an object into its String representation.

So instead of having:

bodyContent = result.toString();

we can have something like:

if(serviceRegistered()) {
  bodyContent = getService().transform(result);
} else {
  bodyContent = result.toString();
}

I think this approach keeps Spark a lightweight web container while giving it the opportunity to be extended. We could then create projects like spark-json (a jar that transforms objects to their JSON representation), spark-xml (objects to XML), and so on.
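A minimal sketch of the service-lookup idea using the JDK's ServiceLoader (my reading of the proposal, not existing Spark code; ResultTransformer is a hypothetical SPI name):

```java
import java.util.ServiceLoader;

// Hypothetical SPI: implementations would be registered via a
// META-INF/services file in, e.g., a spark-json jar.
interface ResultTransformer {
    String transform(Object result);
}

class TransformSupport {
    static String render(Object result) {
        // If a transformer service is registered, use the first one found...
        for (ResultTransformer t : ServiceLoader.load(ResultTransformer.class)) {
            return t.transform(result);
        }
        // ...otherwise fall back to today's behaviour: result.toString().
        return result.toString();
    }

    public static void main(String[] args) {
        System.out.println(TransformSupport.render(42)); // "42" (no service registered)
    }
}
```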

What do you think? Does this approach make sense? If there are no objections I can implement it and have a version ready quickly.

Alex.

bad file descriptor when started with "nohup"

Hi,
I normally run my Jetty-based applications with "nohup" so they stay up even after I log out. However, with Spark this is not possible:

java.io.IOException: Bad file descriptor
    at java.io.FileInputStream.readBytes(Native Method)
    at java.io.FileInputStream.read(FileInputStream.java:236)
    at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
    at java.io.BufferedInputStream.read(BufferedInputStream.java:254)
    at spark.webserver.SparkServerImpl.ignite(SparkServerImpl.java:65)
    at spark.Spark$1.run(Spark.java:176)
    at java.lang.Thread.run(Thread.java:636)

I guess it wants to read from stdin, which is not available under nohup, which is why it fails.

I don't know any other easy way of daemonizing it right now.

javax/servlet/Filter ClassNotFound when Spark.get

I am using

Spark.get(new Route("/")

Which calls

server = SparkServerFactory.create();

And I am getting exception

Exception in thread "Thread-1" java.lang.NoClassDefFoundError: javax/servlet/Filter
at spark.Spark$1.run(Spark.java:196)
at java.lang.Thread.run(Thread.java:662)
Caused by: java.lang.ClassNotFoundException: javax.servlet.Filter
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
... 2 more

Implement HTTP 303 redirect

Please add a method to your Response class that implements HTTP status 303 (See other) redirects. More details on the HTTP status code are available here - http://en.wikipedia.org/wiki/HTTP_303

303 is rather handy when someone tries to POST data that's already been submitted. It's also useful for asynchronous calls.
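A standalone illustration of the 303 mechanics using only the JDK's built-in HTTP server (this is not Spark's API, just what the requested redirect looks like on the wire; the /submit and /result paths are made up):

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.URL;

class SeeOtherDemo {
    static int probeStatus() throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/submit", exchange -> {
            // A 303 tells the client to GET the given Location instead
            // of re-submitting the POST.
            exchange.getResponseHeaders().set("Location", "/result");
            exchange.sendResponseHeaders(303, -1); // -1: no response body
            exchange.close();
        });
        server.start();
        try {
            int port = server.getAddress().getPort();
            HttpURLConnection conn = (HttpURLConnection)
                    new URL("http://localhost:" + port + "/submit").openConnection();
            conn.setInstanceFollowRedirects(false);
            return conn.getResponseCode();
        } finally {
            server.stop(0);
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(probeStatus()); // 303
    }
}
```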

Use byte[] instead of String for Request.body as well as Response

It's not clear how one is supposed to handle arbitrary binary data in Spark, since String is used both for request input and for response output.
If there IS a reliable way in Java to convert between String and arbitrary byte[], I can't find what it is. If such a method exists, the docs should make clear how to do it.
If not, then I suggest using byte[] in place of String for Request.body(), Response.body(), and all the return phases of the pipeline.
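To make the problem concrete: ISO-8859-1 maps each byte to exactly one char and so round-trips arbitrary bytes, but UTF-8 (the usual default) does not, which is why String is a fragile carrier for binary data. A small demonstration:

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

class BinaryRoundTrip {
    // ISO-8859-1: one byte <-> one char, lossless for any byte sequence.
    static byte[] viaLatin1(byte[] data) {
        String s = new String(data, StandardCharsets.ISO_8859_1);
        return s.getBytes(StandardCharsets.ISO_8859_1);
    }

    // UTF-8: invalid byte sequences are replaced with U+FFFD, so the
    // round-trip corrupts arbitrary binary data.
    static byte[] viaUtf8(byte[] data) {
        String s = new String(data, StandardCharsets.UTF_8);
        return s.getBytes(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] binary = { (byte) 0xFF, (byte) 0xFE, 0x00, 0x41 };
        System.out.println(Arrays.equals(binary, viaLatin1(binary))); // true
        System.out.println(Arrays.equals(binary, viaUtf8(binary)));   // false: 0xFF is invalid UTF-8
    }
}
```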

Feature Request: Support for filename as format

Coming from Rails, I would very much like to be able to define a route in this format: new Route("/:file.:format"), or even just new Route("/:file.json"), and be able to read those variables!

Random static files availability

I'm just starting to tinker with Spark 1.1, and strange behaviour is happening that I cannot seem to understand.

I have tried both staticFileLocation("/resources");
and externalStaticFileLocation("fullpath");

When I try to access the files directly by their full path, e.g. http://localhost/img/someimage.jpg, the image loads every time, although the data seems chunked: at random, only about three quarters of the image is loaded...

But when referenced from a response body (in HTML) at http://localhost, they appear and disappear at random on every page load.

Sometimes the resources are accessible, sometimes they are not.

I really have no idea what may be causing this.

Splat support in routes

I'm not a Ruby guy, I'm a Java guy. I have no experience with Sinatra, but I see that Sinatra supports extracting wildcard values from the url as a splat array. That would be a fantastic addition (for me) to your Spark framework. I currently struggle with other frameworks' limitations in handling routing and url encoded paths correctly. Especially when combined with servlet containers that are openly hostile to url encoded paths (I'm talking about you, Tomcat).

I want to be able to match a route and extract a path subset from a url like the following:
/:project:/repo/tree/:ref/*

I was pleased to discover, after experimenting, that you can mix variables and static paths, as in the above example, in Spark - that wasn't immediately obvious from the unit tests nor the samples.

So, in short, being able to easily grab splat[0], splat[1], etc would be tremendous!
I hope it is on your roadmap.

Thanks for your consideration.
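Purely as an illustration of the requested behaviour (this is not Spark's matcher), a pattern like the one above can be compiled to a regex with the wildcard captured as the splat:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

class SplatSketch {
    // :project -> group 1, :ref -> group 2, trailing * (the splat) -> group 3
    static String[] match(String url) {
        Pattern route = Pattern.compile("/([^/]+)/repo/tree/([^/]+)/(.*)");
        Matcher m = route.matcher(url);
        return m.matches()
                ? new String[] { m.group(1), m.group(2), m.group(3) }
                : null;
    }

    public static void main(String[] args) {
        String[] parts = match("/myproject/repo/tree/master/src/main/java/App.java");
        System.out.println(parts[0]); // myproject
        System.out.println(parts[1]); // master
        System.out.println(parts[2]); // splat[0]: src/main/java/App.java
    }
}
```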

Deploying to a server

I'm pretty new to Java web development and I'm having a lot of trouble deploying my application to a remote Tomcat server. The readme on the Spark website isn't helping much. I have a fairly complicated application that runs fine standalone, but could someone post a complete, configured sample application that can be deployed on a remote server?

Documentation including links to sample applications

Would it make sense to update the documentation with links to full featured apps developed in Spark?

I have no problem, for instance, with the lack of a template engine. However, it would be helpful to have some models to look at.

Specifically, I'm thinking of examples of apps serving dynamic / embedded content (e.g. Sinatra's "views").

Switch to using 1.6 Java rather than 1.7

Hi,

Any chance we could reduce the source and target Java version to (at least) 1.6? The code currently only uses the diamond operator in two places and so doesn't strictly need 1.7 compatibility (you just need to explicitly select the generic type arguments).

S.

Further to Content-Type issues in MatcherFilter

I am having great success running Spark with Tomcat; I've been really appreciating your clean minimalistic framework.

However, I have hit an issue with Spark handling of Content-Type, both in versions 1.0 and the fix in 1.1.

The current behaviour of Spark's MatcherFilter is to populate the response's "Content-Type" with the "Accept" header from the request, if the Content-Type is null:

if (httpResponse.getHeader(CONTENT_TYPE_RESPONSE_HEADER) == null) {
    httpResponse.setHeader(CONTENT_TYPE_RESPONSE_HEADER, acceptType);
}

Typical browser "Accept" headers are:

text/html, application/xhtml+xml, */*

or

text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8

These are not valid values for the HTTP "Content-Type" field. Internet Explorer in particular has issue with such values and will force a file download, rather than displaying the webpage. You can try this out on your HelloWorld example with IE10.

In addition, Tomcat does not allow the "Content-Type" field to be populated via the Response.setHeader(String) method, there is code specifically to prevent this:

// XXX Eliminate redundant fields !!!
// ( both header and in special fields )
if( name.equalsIgnoreCase( "Content-Type" ) ) {
    setContentType( value ); return true;
}

Tomcat mandates the use of the explicit setContentType(String) method. Therefore the Content-Type fix for version 1.1 doesn't work in Tomcat, and also doesn't work well in Internet Explorer.

Ideally I'd like to see the code changed to:

if (null == httpResponse.getContentType()) {
    httpResponse.setContentType("text/html; charset=utf-8");
}
httpResponse.getOutputStream().write(bodyContent.getBytes("utf-8"));

Which would be in keeping with outputting UTF-8 via the OutputStream. In any case you need to use setContentType(String) in the ServletResponse rather than setHeader(String).

Apologies for the verbosity. Thank you for your attention.

setting params with capital letters

When setting a route with:

get(new Route("/myroute/:myParam"))

A call to

request.params(":myParam")

returns null.

It looks like the capital letter in myParam causes the issue.
You might consider lowerCase()-ing the param names.
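A guess at the mechanics, sketched with a plain map (the lower-cased storage is an assumption about Spark's internals, not verified against its source):

```java
import java.util.HashMap;
import java.util.Map;

// If the framework stores parameter names lower-cased at route-parse time,
// a case-sensitive lookup with ":myParam" misses, which would explain the
// null described above.
class ParamCaseSketch {
    public static void main(String[] args) {
        Map<String, String> params = new HashMap<>();
        params.put(":myparam", "42"); // assumed lower-cased by the framework

        System.out.println(params.get(":myParam"));               // null
        System.out.println(params.get(":myParam".toLowerCase())); // 42
    }
}
```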

error set contenttype

When I set the contentType in my route, Spark resets it at

MatcherFilter.java line 203

httpResponse.addHeader("Content-Type", acceptType);

Perhaps it would be better to change this to:

if(httpResponse.getContentType()==null) {
    httpResponse.addHeader("Content-Type", acceptType);
}

java.io.IOException: Closed while trying to send binary content in response

Following code

@Override
public Object handle(Request request, Response response) {
    byte[] result = createChartImage(findByBMetric);
    response.status(200);
    response.type(JPG_CONTENT_TYPE);
    final HttpServletResponse rawResponse = response.raw();
    rawResponse.setContentType("image/jpeg");
    rawResponse.setContentLength(result.length);
    final ServletOutputStream outputStream = rawResponse.getOutputStream();

    outputStream.write(result);
    return result;
}

works but produces warning in console

2012-05-03 14:00:49,323 WARN - log - /stabilitygraph?metric=ru.selenium.express.web.CheckMessaging.clearRecentMessagesUser2: java.io.IOException: Closed

Possibly I am writing binary content in the wrong way; please show me the correct one.

Package 0.9.9.5-SNAPSHOT for Maven

Hi. I wanted to use 0.9.9.5-SNAPSHOT due to the addition of cookies, but it's not in your maven repository. Could it please be added?

This is a great project BTW, thank you for creating it.

Thank you.

64kB output limit

It seems that Spark is only putting out the first 64kB of my response (which I return from the route handler as a huge String).

Perhaps the limitations of some OutputStreams are at issue, since PrintWriters and so on can use those internally.

This might be alleviated by accepting CharSequences (and converting everything else to one), then using subSequence() and calling toString() on each subsequence alone, to produce chunks small enough to send properly. Converting everything to a String up front, instead of leaving it as a CharSequence, should be avoided so that serving huge files doesn't exhaust memory.
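The subSequence idea can be sketched like this (a standalone illustration of the chunking strategy, not Spark code; the 8 KiB chunk size is arbitrary):

```java
import java.io.IOException;
import java.io.StringWriter;
import java.io.Writer;

class ChunkedWrite {
    // Stream a large CharSequence in fixed-size windows instead of
    // materializing one huge String on the heap.
    static void writeInChunks(CharSequence body, Writer out, int chunkSize)
            throws IOException {
        for (int start = 0; start < body.length(); start += chunkSize) {
            int end = Math.min(start + chunkSize, body.length());
            out.write(body.subSequence(start, end).toString());
            out.flush(); // push each chunk out rather than buffering everything
        }
    }

    public static void main(String[] args) throws IOException {
        StringBuilder big = new StringBuilder();
        for (int i = 0; i < 100000; i++) big.append('x');

        StringWriter sink = new StringWriter();
        writeInChunks(big, sink, 8192);
        System.out.println(sink.toString().length()); // 100000
    }
}
```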

Template engine for render views

Hi Wendel,

First, congrats for this great project.

How do I render a template like in Sinatra? In Sinatra I use the following:

get "/" do
  erb :index
end

The template for the index page is found (read) in the views directory by default.

In Spark, maybe use the following syntax:

get(new Route("/") {
  @Override
  public Object handle(Request request, Response response) {
    return jsp("index");
  }
});

Ryan Tomayko (@rtomayko) developed Tilt, a template handler for Sinatra.

Request.url() returns null when HTTP PUT request is made

When I deploy my Spark application in its own container, as opposed to using embedded Jetty, and have code that calls Request.url(), the string "null" is returned along with the rest of the URL. This occurs when a PUT request is made.

Tests depend on the order of execution

running "mvn test" I see that at least on my installation the testGetBook() test of BooksIntegrationTest is run before testCreateBook(), which means that the "id" field is not yet set. Consequently, the test fails to GET "/books/null".

From what I understand, it is not recommended to rely on the execution order of tests. Instead, one could essentially insert a call to testCreateBook() inside testGetBook(). This would create two books over the course of the test, but it would also mean the test runs to completion.

Concurrency (Jetty thread pool?) behaving strange. Serial behavior.

When running:

Spark.get(new Route("/hi") {
     @Override
     public Object handle(Request request, Response response) {
         try {
             Thread.sleep(5000);
         } catch (InterruptedException e) {
             e.printStackTrace();
         }
         return "Thread: " + Thread.currentThread();
     }
});

and doing two HTTP requests at the same time, one would expect both to execute concurrently, but instead they are executed serially.

I tried to change the Jetty threadpool from QueuedThreadPool to ExecutorThreadPool to see if this had any effect but it didn't.

This is a pretty serious bug, and I would be grateful for help from anyone with any idea why this is happening.

Java 6 compatibility

Hello there,

I was wondering if you have any plans to maintain a Java 6 branch, since there are plenty of people who can't (or won't) upgrade to 7 just yet.
