
pure-protobuf


Wow! Such annotated! Very buffers!

Documentation

Quick examples

.proto definition

A .proto definition isn't needed for pure-protobuf, but for the sake of the example, let's consider the following definition:

syntax = "proto3";

message SearchRequest {
  string query = 1;
  int32 page_number = 2;
  int32 result_per_page = 3;
}

And here's the same via pure-protobuf:

from dataclasses import dataclass
from io import BytesIO

from pure_protobuf.annotations import Field
from pure_protobuf.message import BaseMessage
from typing_extensions import Annotated


@dataclass
class SearchRequest(BaseMessage):
    query: Annotated[str, Field(1)] = ""
    page_number: Annotated[int, Field(2)] = 0
    result_per_page: Annotated[int, Field(3)] = 0


request = SearchRequest(query="hello", page_number=1, result_per_page=10)
buffer = bytes(request)
assert buffer == b"\x0A\x05hello\x10\x01\x18\x0A"
assert SearchRequest.read_from(BytesIO(buffer)) == request
And the same message defined via pydantic:

from io import BytesIO

from pure_protobuf.annotations import Field
from pure_protobuf.message import BaseMessage
from pydantic import BaseModel
from typing_extensions import Annotated


class SearchRequest(BaseMessage, BaseModel):
    query: Annotated[str, Field(1)] = ""
    page_number: Annotated[int, Field(2)] = 0
    result_per_page: Annotated[int, Field(3)] = 0


request = SearchRequest(query="hello", page_number=1, result_per_page=10)
buffer = bytes(request)
assert buffer == b"\x0A\x05hello\x10\x01\x18\x0A"
assert SearchRequest.read_from(BytesIO(buffer)) == request
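For readers new to the wire format, the asserted buffer decodes as follows (this breakdown follows the standard protobuf encoding rules and is explanatory only):

# b"\x0A\x05hello\x10\x01\x18\x0A" decodes as:
# 0x0A       -> tag: field 1 (query), wire type 2 (length-delimited)
# 0x05 hello -> length 5, payload "hello"
# 0x10 0x01  -> tag: field 2 (page_number), varint value 1
# 0x18 0x0A  -> tag: field 3 (result_per_page), varint value 10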

Contributors

bbayles, d70-t, dependabot[bot], eigenein, mokto, ottergottaott, renovate[bot], shibotto

Issues

Self-referencing messages

Hi, I'm trying to figure out if it's possible to make a @message class referencing itself, like this:

message Box {
  Box box = 1;
  int32 value = 2;
}

The straightforward way didn't work:

@message
@dataclass
class Box:
    box: Optional['Box'] = field(1)
    value: int = field(2)
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
File ~/tmp-venv-p396/lib/python3.9/site-packages/pure_protobuf/dataclasses_.py:202, in make_field(number, name, type_, packed)
    201 try:
--> 202     serializer = SERIALIZERS[type_]
    203 except KeyError as e:

KeyError: <class '__main__.Box'>

The above exception was the direct cause of the following exception:

TypeError                                 Traceback (most recent call last)
Input In [72], in <module>
      1 @message
      2 @dataclass
----> 3 class Box:
      4     box: Optional['Box'] = field(1)
      5     value: int = field(2)

File ~/tmp-venv-p396/lib/python3.9/site-packages/pure_protobuf/dataclasses_.py:143, in message(cls)
    141         casted_cls.__protobuf_fields__.update(children)
    142     else:
--> 143         num, proto_field = make_field(field_.metadata['number'],
    144                                       field_.name,
    145                                       type_hints[field_.name],
    146                                       field_.metadata['packed'])
    147         casted_cls.__protobuf_fields__[num] = proto_field
    149 Message.register(cls)  # type: ignore

File ~/tmp-venv-p396/lib/python3.9/site-packages/pure_protobuf/dataclasses_.py:204, in make_field(number, name, type_, packed)
    202         serializer = SERIALIZERS[type_]
    203     except KeyError as e:
--> 204         raise TypeError(f'type is not serializable: {type_}') from e
    206 if not is_repeated:
    207     # Non-repeated field.
    208     return number, NonRepeatedField(number, name, serializer, is_optional)

TypeError: type is not serializable: <class '__main__.Box'>
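For what it's worth, the library's test suite (quoted in the forward-references issue below) defines a RecursiveMessage with the 3.x annotation-based API, which suggests self-reference is supported there via Self. A minimal sketch along those lines, assuming the 3.x imports:

from dataclasses import dataclass
from typing import Optional

from pure_protobuf.annotations import Field, uint
from pure_protobuf.message import BaseMessage
from typing_extensions import Annotated, Self


@dataclass
class Box(BaseMessage):
    # Mirrors RecursiveMessage from tests/definitions.py, quoted below.
    payload: Annotated[uint, Field(1)] = 0
    inner: Annotated[Optional[Self], Field(2)] = None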

Performance-related benchmarks for serialization and deserialization

This is more of a question.
I was trying to benchmark JSON vs. Protobuf vs. Avro data formats for my upcoming project. I am using orjson for JSON serialization and deserialization, and pure-protobuf for Protobuf.
Here's what the code looks like:

Data classes:

from time import time
from dataclasses import dataclass

from faker import Faker
from pure_protobuf.dataclasses_ import field, message

Faker.seed(0)
fake = Faker()


@message
@dataclass
class Head:
    msgId: str = field(1)
    msgCode: str = field(2)
    guid: str = field(3)
    src: str = field(4)
    ts: int = field(5)

    @staticmethod
    def fakeMe():
        return Head(fake.md5(),
                fake.pystr(min_chars=5, max_chars=5),
                fake.ean(length=13),
                fake.pystr(min_chars=1, max_chars=1),
                int(time()*1000)
            )

@message
@dataclass
class Message:
    head: Head = field(1)
    # data: Data = field(2)
    status: bool = field(2)

    def fakeMe(self):
        self.head = Head.fakeMe()
        # self.data = Data.fakeMe()
        self.status = fake.pybool()
        return self

Running Serialization and Deserialization:

import time, sys, orjson, message_pb2
from object_gen import create_dummy_obj
from dto.device_message import Message          # this is my data class

def measure_serialize_deserialize(obj, format):
  ser_fun = ser_obj.get(format)

  deser_fun = deser_obj.get(format)

  # serialize and measure time
  start_time = time.time()
  ser_data = ser_fun(obj)
  time_taken_ser = time.time() - start_time
  mem_ser = sys.getsizeof(ser_data)

  # deserialize and measure time
  start_time = time.time()
  deser_data = deser_fun(ser_data, Message)
  time_taken_deser = time.time() - start_time

  return (time_taken_ser, time_taken_deser, mem_ser)


def serialize_json(obj):
  return orjson.dumps(obj)



def deserialize_json(byteArr, klass):
  return orjson.loads(byteArr)


def serialize_proto(obj):
  return obj.dumps()


def deserialize_proto(byteArr, klass):
  return klass.loads(byteArr)


def serialize_avro(obj):
  pass

def deserialize_avro(byteArr, klass):
  pass


ser_obj = {
  "J": serialize_json,
  "P": serialize_proto,
  "A": serialize_avro,
}

deser_obj = {
  "J": deserialize_json,
  "P": deserialize_proto,
  "A": deserialize_avro,
}


def runBenchMarks(numberOfMsgs, format):

  ser_times = []
  deser_times = []
  memory_usage_plain = []
  memory_usage_ser = []

  for i in range(1, numberOfMsgs + 1):
    # create new object basis on the format.
    obj = create_dummy_obj(format)
    memory_usage_plain.append(sys.getsizeof(obj))
    ser_time, deser_time, mem_ser = measure_serialize_deserialize(obj, format)
    ser_times.append(ser_time)
    deser_times.append(deser_time)
    memory_usage_ser.append(mem_ser)

  # return values
  return ser_times, deser_times, memory_usage_plain, memory_usage_ser

After running the program for 1000 messages using pure-protobuf, I found:

Running benchmark for 1000 samples and format = P


 =========== Serialization METRICES (Time in ms) ====================
Total Time taken for serialization: 15.65241813659668
Avg Time taken for serialization: 0.01565241813659668
Min Time taken for serialization: 0.014781951904296875
Max Time taken for serialization: 0.04220008850097656


 =========== Deserialization METRICES (Time in ms) ====================
Total Time taken for deserialization: 21.908044815063477
Avg Time taken for deserialization: 0.021908044815063477
Min Time taken for deserialization: 0.0209808349609375
Max Time taken for deserialization: 0.051975250244140625


 =========== MEMORY METRICES (Bytes) ====================
Total memory utilized by Plain objects: 103000
Avg memory utilized by Plain objects: 103.0
Min memory utilized: 103
Max memory utilized: 103

Total memory utilized by serialized objects: 103000
Avg memory utilized by serialized objects: 103.0
Min memory utilized: 103
Max memory utilized: 103

Then I ran the same code for JSON:

Running benchmark for 1000 samples and format = J


 =========== Serialization METRICES (Time in ms) ====================
Total Time taken for serialization: 0.9558200836181641
Avg Time taken for serialization: 0.0009558200836181642
Min Time taken for serialization: 0.0
Max Time taken for serialization: 0.20194053649902344


 =========== Deserialization METRICES (Time in ms) ====================
Total Time taken for deserialization: 1.4314651489257812
Avg Time taken for deserialization: 0.0014314651489257812
Min Time taken for deserialization: 0.0007152557373046875
Max Time taken for deserialization: 0.029087066650390625


 =========== MEMORY METRICES (Bytes) ====================
Total memory utilized by Plain objects: 182518
Avg memory utilized by Plain objects: 182.518
Min memory utilized: 182
Max memory utilized: 183

Total memory utilized by serialized objects: 182518
Avg memory utilized by serialized objects: 182.518
Min memory utilized: 182
Max memory utilized: 183

As you can see, the total serialization and deserialization times are much higher than JSON's. If I understand protobuf correctly, this ideally should not be the case. Could it be because we are compiling the proto schema every time we call obj.dumps()?
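As a side note on the measurement itself: time.time() has coarse resolution (the JSON run above even reports a minimum of 0.0 ms), so timings this small are better taken with time.perf_counter() over many repetitions. A minimal sketch, assuming the same obj and ser_fun as above:

import time


def time_serialization(obj, ser_fun, repetitions=1000):
    # perf_counter is monotonic and has the highest available resolution,
    # which matters for sub-millisecond calls like these.
    start = time.perf_counter()
    for _ in range(repetitions):
        ser_fun(obj)
    return (time.perf_counter() - start) / repetitions  # average seconds per call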

Using example code results in mypy complaining that `Type[SearchRequest]` has no attribute `loads` or `dumps`

Firstly, great job with this package. I really disliked vanilla protobuf's poorly type-hinted code; using dataclasses makes my life way easier.

Issue

Using the example code in the readme makes mypy complain about missing loads and dumps methods on Type[SearchRequest], even though the code is functional and both methods exist at runtime. Example code:

from dataclasses import dataclass

from pure_protobuf.dataclasses_ import field, message
from pure_protobuf.types import int32


@message
@dataclass
class SearchRequest:
    query: str = field(1, default="")
    page_number: int32 = field(2, default=int32(0))
    result_per_page: int32 = field(3, default=int32(0))


obj = SearchRequest(
    query="hello",
    page_number=int32(1),
    result_per_page=int32(10),
)
serialized = obj.dumps()
deserialized = SearchRequest.loads(serialized)
print(f"{serialized=}")
print(f"{deserialized=}")

output:

serialized=b'\n\x05hello\x10\x01\x18\n'
deserialized=SearchRequest(query='hello', page_number=1, result_per_page=10)

mypy output:

example.py
example.py:3: error: Skipping analyzing "pure_protobuf.dataclasses_": module is installed, but missing library stubs or py.typed marker  [import]
example.py:3: note: See https://mypy.readthedocs.io/en/stable/running_mypy.html#missing-imports
example.py:4: error: Skipping analyzing "pure_protobuf.types": module is installed, but missing library stubs or py.typed marker  [import]
example.py:20: error: "SearchRequest" has no attribute "dumps"  [attr-defined]
example.py:21: error: "Type[SearchRequest]" has no attribute "loads"  [attr-defined]
Found 4 errors in 1 file (checked 1 source file)

workaround

Though a little repetitive, it is possible to use the base class's super() together with getattr to explicitly define dumps and loads methods:

@message
@dataclass
class SearchRequest:
    query: str = field(1, default="")
    page_number: int32 = field(2, default=int32(0))
    result_per_page: int32 = field(3, default=int32(0))

    def dumps(self) -> bytes:
        return getattr(super(), "dumps")()

    @classmethod
    def loads(cls, data: bytes) -> "SearchRequest":
        return getattr(super(), "loads")(data)
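A lighter-weight alternative (my suggestion, not from the issue) is to suppress the error at the call sites rather than wrapping the methods:

serialized = obj.dumps()  # type: ignore[attr-defined]
deserialized = SearchRequest.loads(serialized)  # type: ignore[attr-defined]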

Versions info

  • python: 3.10.8
  • mypy: 0.991 (compiled: yes)
  • pure-protobuf: 2.2.0
  • os: Linux 5.15.85-1-MANJARO

OneOf support

Any plans for this? Syntactically, I think something like this makes sense:

from dataclasses import dataclass
from typing import Union

from pure_protobuf.dataclasses_ import field, message, oneof
from pure_protobuf.types import double, uint32, uint64


@message
@dataclass
class MessageWithOneofs:
    message_data: Union[uint64, double] = oneof()
    message_id: uint32 = field(1)
    message_desc: str = field(2)
    counter: uint64 = field(3, oneof="message_data")
    value: uint64 = field(4, oneof="message_data")

...However, I'm assuming that breaks something in how you parse fields.
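Until there is first-class support, one hedged emulation (my sketch, not the proposed API) is to declare each oneof member as a separate Optional field and enforce exclusivity by hand; this works because on the wire a oneof is simply "at most one of these fields is set":

from dataclasses import dataclass
from typing import Optional

from pure_protobuf.dataclasses_ import field, message
from pure_protobuf.types import double, uint64


@message
@dataclass
class MessageWithOneofs:
    counter: Optional[uint64] = field(3, default=None)
    value: Optional[double] = field(4, default=None)

    def __post_init__(self):
        # Reject instances that set more than one member of the "oneof".
        if self.counter is not None and self.value is not None:
            raise ValueError("counter and value are mutually exclusive")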

BytesSerializer doesn't accept bytes-like objects

Hi,

The current BytesSerializer doesn't allow bytes-like objects to be passed, even though io.write natively supports them. Is there a reason for this?

class BytesSerializer(Serializer):
    """
    Serializes a byte string.
    See also: https://developers.google.com/protocol-buffers/docs/encoding#strings
    """

    wire_type = WireType.BYTES

    def validate(self, value: Any):
        if not isinstance(value, bytes):
            raise ValueError('a byte string is expected')

    def dump(self, value: Any, io: IO):
        unsigned_varint_serializer.dump(len(value), io)
        io.write(value)

    def load(self, io: IO) -> Any:
        length = unsigned_varint_serializer.load(io)
        return io.read(length)

I think changing the current validate function to something like:

def validate(self, value: Any):
    if not isinstance(value, (bytes, bytearray, memoryview)):
        try:
            memoryview(value)
        except TypeError as exc:
            raise ValueError('a byte string is expected') from exc

That should allow bytes-like objects to work transparently. If that's fine with you, I could open a pull request.
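For illustration, with the patched validate above, any object exposing the buffer protocol would pass validation (a sketch):

serializer = BytesSerializer()
serializer.validate(b'raw')                 # accepted today
serializer.validate(bytearray(b'mutable'))  # accepted with the patch
serializer.validate(memoryview(b'view'))    # accepted with the patch
serializer.validate(42)                     # still raises ValueError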

Support forward references

Test case

diff --git a/pure_protobuf/message.py b/pure_protobuf/message.py
index 81813ca..acc0b1e 100644
--- a/pure_protobuf/message.py
+++ b/pure_protobuf/message.py
@@ -53,6 +53,7 @@ class BaseMessage(ABC):
         cls.__PROTOBUF_FIELDS_BY_NAME__ = {}
 
         type_hints: Dict[str, Any] = get_annotations(cls, eval_str=True)
+
         for name, hint in type_hints.items():
             descriptor = _FieldDescriptor.from_attribute(cls, hint)
             if descriptor is not None:
diff --git a/tests/definitions.py b/tests/definitions.py
index 4468da0..a1d9053 100644
--- a/tests/definitions.py
+++ b/tests/definitions.py
@@ -21,3 +21,13 @@ class ExampleEnum(IntEnum):
 class RecursiveMessage(BaseMessage):
     payload: Annotated[uint, Field(1)]
     inner: Annotated[Optional[Self], Field(2)] = None
+
+
+@dataclass
+class CyclicMessageA(BaseMessage):
+    inner: Annotated[Optional["CyclicMessageB"], Field(1)]
+
+
+@dataclass
+class CyclicMessageB(BaseMessage):
+    inner: Annotated[Optional["CyclicMessageA"], Field(1)]
diff --git a/tests/descriptors/test_field.py b/tests/descriptors/test_field.py
index f6423cf..58247bd 100644
--- a/tests/descriptors/test_field.py
+++ b/tests/descriptors/test_field.py
@@ -9,7 +9,7 @@ from pure_protobuf.exceptions import IncorrectAnnotationError
 from pure_protobuf.io.wrappers import to_bytes
 from pure_protobuf.message import BaseMessage
 from tests import pytest_test_id
-from tests.definitions import ExampleEnum, RecursiveMessage
+from tests.definitions import ExampleEnum, RecursiveMessage, CyclicMessageA, CyclicMessageB
 
 
 @mark.parametrize("hint", [int, Annotated[int, ...]])
@@ -45,7 +45,7 @@ def test_from_inner_hint_incorrect(hint: Any) -> None:
             b"\xd2\x02\x06\x08\x01\x12\x02\x08\x02",
         ),
         (Annotated[List[uint], Field(1, packed=False)], [1, 2], b"\x08\x01\x08\x02"),
-        # TODO: what about messages with cyclic dependencies?
+        (Annotated[CyclicMessageA, Field(2)], CyclicMessageA(CyclicMessageB(None)), b""),
     ],
     ids=pytest_test_id,
 )

Code generator

For 3rd-party APIs, people often have messages defined via *.proto files which are the "ground truth". It requires additional work to keep the Python data classes in sync with the *.proto files. Let's make a "code generator" which translates *.proto files into *.py modules.

Incorrect encoding of int32 and int64 when communicating with Google protobuf int32 and int64

I found a problem in the encoding of int32 and int64.
In the type definitions, int32 and int64 are aliased to uint32 and uint64.
This is not correct, because it doesn't allow negative values for intN.

According to the Protocol Buffers encoding of signed integers, intN values need to be encoded as two's complement.

Therefore, a new type definition for intN is necessary to allow negative intN values and to guarantee compatibility with the standard Google protobuf encoding.

I've created a small patch, which should solve the encoding problem for intN values.

import struct

from google.protobuf import message
from itertools import count
from pure_protobuf import types as pureprototype
from pure_protobuf.dataclasses_ import SERIALIZERS
from pure_protobuf.serializers import Serializer
from pure_protobuf.enums import WireType
from pure_protobuf.io_ import IO
from pure_protobuf.serializers import signed_varint_serializer
from typing import NewType, Any, TypeVar, Type

pureprototype.int32 = NewType("int32", int)
pureprototype.int64 = NewType("int64", int)

T = TypeVar('T')


class IntNSerializer(Serializer):
    wire_type = WireType.VARINT
    local_int2byte = struct.Struct('>B').pack

    def __init__(self, sign_bit: int, mask: int, int_n: Type[T], int_n_range: int):
        self.sign_bit = sign_bit
        self.mask = mask  # e.g. (1 << 32) - 1 for int32
        self.int_n = int_n
        self.int_n_range = int_n_range

    def validate(self, value: Any):
        signed_varint_serializer.validate(value)
        # Two's complement: the minimum representable value is -int_n_range - 1.
        if not -self.int_n_range - 1 <= value <= self.int_n_range:
            raise ValueError("value is out of integer range")

    def dump(self, value: Any, io: IO):
        if value < 0:
            value += (1 << 64)
        bits = value & 0x7f
        value >>= 7
        while value:
            io.write(self.local_int2byte(0x80 | bits))
            bits = value & 0x7f
            value >>= 7
        return io.write(self.local_int2byte(bits))

    def load(self, io: IO) -> Any:
        value = 0
        for shift in count(0, 7):
            (byte,) = io.read(1)
            value |= (byte & 0x7F) << shift
            if not byte & 0x80:
                value &= self.mask
                value = (value ^ self.sign_bit) - self.sign_bit
                return self.int_n(value)
            shift += 7
            if shift >= 64:
                raise message.DecodeError('Too many bytes when decoding varint.')
        return value


SERIALIZERS.update(
    {
        pureprototype.int32: IntNSerializer(
            sign_bit=1 << (32 - 1),
            mask=(1 << 32) - 1,
            int_n=pureprototype.int32,
            int_n_range=0x7FFFFFFF
        )
    }
)
SERIALIZERS.update(
    {
        pureprototype.int64: IntNSerializer(
            sign_bit=1 << (64 - 1),
            mask=(1 << 64) - 1,
            int_n=pureprototype.int64,
            int_n_range=0x7FFFFFFF_FFFFFFFF,
        )
    }
)

It would be good if you could have a look at this patch and provide the IntNSerializer in one of the upcoming pure-protobuf releases.

Thanks a lot!
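For illustration, a quick round-trip check of the patch (a sketch; it assumes the SERIALIZERS registration above has run):

from io import BytesIO

int32_serializer = SERIALIZERS[pureprototype.int32]
buffer = BytesIO()
int32_serializer.dump(-1, buffer)
# Per the protobuf spec, a negative intN sign-extends to 64 bits, so -1
# encodes as ten varint bytes: nine 0xFF followed by 0x01.
assert buffer.getvalue() == b'\xff' * 9 + b'\x01'
buffer.seek(0)
assert int32_serializer.load(buffer) == -1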

Question: relationship with gRPC

Just found your project, looks really cool.

I am wondering: is there an intention to support this, or does this library already allow one to create a gRPC server using the @message-decorated dataclasses?

Is there anyone doing this currently who could show an example?

Shorter syntax for optional field

In proto3, all fields are optional. But in Python code, to declare a field as optional, I have to write long lines of code:

import dataclasses
from dataclasses import dataclass
from typing import Optional

from pure_protobuf.dataclasses_ import field, message

# DeviceDataRequest and the other request messages are defined elsewhere.


def optional_field(number: int) -> dataclasses.Field:
    return field(number, default=None)

@message
@dataclass
class DeviceToServerMessage:
    parking_report: Optional[DeviceDataRequest] = optional_field(1)
    firmware_update_check: Optional[FirmwareUpdateRequest] = optional_field(2)
    reg_request: Optional[DeviceRegistrationRequest] = optional_field(3)
    auth_request: Optional[DeviceAuthenticationRequest] = optional_field(4)
    # For mesh network
    mesh_src: Optional[str] = optional_field(13)

Not only do I have to wrap the type in Optional, I also have to pass default=None to the field() call.

Do you have any ideas for making this shorter and less repetitive?
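One possible shortening (my sketch, not a library API): since the helper only pins a keyword argument, functools.partial can define it in one line.

from functools import partial

# Equivalent to the optional_field helper above.
optional_field = partial(field, default=None)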

Move out well-known type serializers to a separate module

Includes:

  • DateTimeSerializer
  • SERIALIZERS constant

I'd like to separate types from serializers, and maybe put the well-known types in a separate sub-package (like pure_protobuf.types.google?); see also #41.

And maybe make serializers a package with standard and google inside.

Empty list of messages seems to be broken

If I define a proto containing a list of builtins, everything is fine; if I create a proto containing a list of protos, then loading seems to fail (although it could be that the generated bytes are wrong, too):

from dataclasses import dataclass
from typing import List
from pure_protobuf.dataclasses_ import field, message


@message
@dataclass
class MyData:
    a: float = field(1)


@message
@dataclass
class ManyThings:
    mylist: List[float] = field(1)
    myemptylist: List[float] = field(2)


@message
@dataclass
class ManyThings2:
    mydatalist: List[MyData] = field(1)
    myemptydatalist: List[MyData] = field(2)


if __name__ == "__main__":
    d = MyData(1.0)
    m = ManyThings([1.0], [])
    mybytes = m.dumps()
    res = ManyThings.loads(mybytes)
    print(res)
    m2 = ManyThings2([d, d], [])
    mybytes2 = m2.dumps()
    res = ManyThings2.loads(mybytes2)
    print(res)

result:

ManyThings(mylist=[1.0], myemptylist=[])
Traceback (most recent call last):
  File "/home/olivier/kurant/dev/ipc/tt/t2.py", line 34, in <module>
    res = ManyThings2.loads(mybytes2)
  File "/home/olivier/.virtualenvs/grab/lib/python3.10/site-packages/pure_protobuf/dataclasses_.py", line 111, in loads
    return load(cls, io)
  File "/home/olivier/.virtualenvs/grab/lib/python3.10/site-packages/pure_protobuf/dataclasses_.py", line 103, in load
    return cls.serializer.load(io)
  File "/home/olivier/.virtualenvs/grab/lib/python3.10/site-packages/pure_protobuf/serializers/__init__.py", line 438, in load
    return self.type_(**values)
TypeError: ManyThings2.__init__() missing 1 required positional argument: 'myemptydatalist'
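A possible workaround (my suggestion, not from the issue): an empty repeated field is simply absent from the wire, so giving repeated message fields an explicit empty-list default lets __init__ succeed when the field is missing. The unpacked-repeated-fields issue below shows that field() accepts default_factory:

@message
@dataclass
class ManyThings2:
    # default_factory=list makes a missing repeated field decode as [].
    mydatalist: List[MyData] = field(1, default_factory=list)
    myemptydatalist: List[MyData] = field(2, default_factory=list)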

Zeroed fields by default for proto3?

I noticed that part of what was brought up in #63 wasn't really discussed, namely that proto3 doesn't have the concept of "nullable". Any field not specified is left at whatever is considered "zero" for that particular type and is omitted from the message. This clashes with how the library currently behaves, where initializing a message without specifying every field results in an error.

Zeroed defaults would remove the need for Optional hints and the default keyword in definitions (although defaults would still be a nice-to-have), while of course sacrificing compatibility with proto2.

serializing unpacked repeated fields

Thank you for this nice protobuf implementation 🎉.

Unfortunately, I've got a use case (writing IPFS UnixFSv1 data) where I need to serialize repeated fields in the unpacked format.

I discovered that there's already the UnpackedRepeatedField serializer, but I didn't see a way of defining messages such that they serialize using it.
I've got a proof-of-concept implementation which would enable writing packed=False like this:

@message
@dataclass
class Message:
    foo: List[int32] = field(1, default_factory=list, packed=False)

to create messages in the unpacked representation. However, I don't yet know whether this would be the right way to do it.
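For reference, here is the wire-level difference this toggles, for the list [1, 2] on field number 1 (bytes derived from the protobuf encoding rules; the unpacked form matches the test case quoted in the forward-references issue above):

packed = b'\x0a\x02\x01\x02'    # one length-delimited record: tag 0x0A, length 2, payload 01 02
unpacked = b'\x08\x01\x08\x02'  # one varint record per element: tag 0x08 before each value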

Convert to dict

I'm using pure_protobuf in a Django project where I want to support both a REST API and WebSocket+Protobuf.

I want to reuse Django REST Framework serializers to validate the data before saving it to the database. They expect the input to be a dict, so it would be nice if pure_protobuf supported converting a PB3 message to a dict.
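A hedged interim workaround (not a library feature): because @message-decorated classes are plain dataclasses, the standard library already covers the conversion.

from dataclasses import asdict

# Recursively converts a message, including nested messages, into a dict;
# `msg` stands for any decoded message instance (hypothetical here).
payload = asdict(msg)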
