fl03 / acme
Acme aims to be a complete auto differentiation system written in Rust.
Home Page: https://crates.io/crates/acme
License: Apache License 2.0
For higher-dimensional tensors, a special iterator is needed to correctly loop through each element; the tensor should also be able to display the captured data accordingly.
For example,
[
[a, b, c],
[d, e, f],
[g, h, i]
]
AxisIter
The base iterator should visit the elements in logical (row-major) order: the element located at [0, .., 0] comes first, [0, .., 1] second, and so on, ending at [m, .., n].
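That traversal order can be sketched with a small helper that walks a shape in row-major order (a hypothetical stand-in, not the crate's actual AxisIter):

```rust
/// Produce every multi-dimensional index of `shape` in row-major
/// (logical) order: [0, .., 0], [0, .., 1], ..., [m, .., n].
fn indices(shape: &[usize]) -> Vec<Vec<usize>> {
    let mut out = Vec::new();
    let mut idx = vec![0usize; shape.len()];
    loop {
        out.push(idx.clone());
        // Increment the last axis first, carrying into earlier axes.
        let mut axis = shape.len();
        loop {
            if axis == 0 {
                return out; // carried past the first axis: done
            }
            axis -= 1;
            idx[axis] += 1;
            if idx[axis] < shape[axis] {
                break;
            }
            idx[axis] = 0; // overflowed this axis; carry left
        }
    }
}

fn main() {
    let order = indices(&[2, 2]);
    assert_eq!(order, vec![vec![0, 0], vec![0, 1], vec![1, 0], vec![1, 1]]);
}
```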
Currently, the grad macro uses a HashMap to store computed gradients. This works until the gradients must be returned as an array: a HashMap iterates in arbitrary order, so vectorizing it yields the entries in no guaranteed order.
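The fix amounts to imposing an explicit ordering before vectorizing. A minimal sketch (illustrative only, not the macro's actual internals) that sorts on each variable's index rather than relying on HashMap iteration order:

```rust
use std::collections::HashMap;

/// Vectorize a gradient store deterministically by sorting on the
/// variable's index instead of relying on HashMap iteration order.
fn vectorize(grad: &HashMap<usize, f64>) -> Vec<f64> {
    let mut entries: Vec<(usize, f64)> = grad.iter().map(|(&k, &v)| (k, v)).collect();
    entries.sort_by_key(|&(k, _)| k);
    entries.into_iter().map(|(_, v)| v).collect()
}

fn main() {
    let mut grad = HashMap::new();
    grad.insert(1, 3.0); // dy
    grad.insert(0, 4.0); // dx
    // Variable 0's gradient always comes first, regardless of insertion order.
    assert_eq!(vectorize(&grad), vec![4.0, 3.0]);
}
```

An order-preserving map (e.g. a BTreeMap keyed by variable index) would achieve the same determinism without the sort.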
Tensors are mathematical objects often used to describe physical properties. Given the limitations of Rust procedural macros, tensors become all the more important: they can underpin both the automatic differentiation suite and the machine-learning library, concision.
When building dynamic compute graphs, nodes are considered to be tensors while the edges represent operations. Building a sufficiently robust tensor object is fundamental when attempting to build a complete artificial intelligence and machine learning system.
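A dynamic compute graph of that shape can be sketched as nodes holding tensor values with edges recording the operation that produced each node (a simplified scalar illustration, not acme's actual graph types):

```rust
/// Operations that label the edges of the graph.
#[derive(Clone, Copy, Debug)]
enum Op {
    Add,
    Mul,
}

/// A toy compute graph: nodes are (scalar) tensors, edges are operations.
struct Graph {
    values: Vec<f64>,               // one value per node
    edges: Vec<(usize, usize, Op)>, // (lhs, rhs, op) that produced a node
}

impl Graph {
    fn new() -> Self {
        Graph { values: Vec::new(), edges: Vec::new() }
    }

    /// Insert a leaf tensor (an input with no producing operation).
    fn leaf(&mut self, value: f64) -> usize {
        self.values.push(value);
        self.values.len() - 1
    }

    /// Apply an operation to two nodes, recording the edge.
    fn apply(&mut self, op: Op, lhs: usize, rhs: usize) -> usize {
        let value = match op {
            Op::Add => self.values[lhs] + self.values[rhs],
            Op::Mul => self.values[lhs] * self.values[rhs],
        };
        let id = self.leaf(value);
        self.edges.push((lhs, rhs, op));
        id
    }
}

fn main() {
    let mut g = Graph::new();
    let x = g.leaf(3.0);
    let y = g.leaf(4.0);
    let z = g.apply(Op::Mul, x, y);
    assert_eq!(g.values[z], 12.0);
    assert_eq!(g.edges.len(), 1); // one operation recorded
}
```

Walking the recorded edges in reverse is then the backbone of reverse-mode differentiation.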
It may be possible to generate a macro that represents the gradient; for example:

```rust
extern crate acme;

use acme::operator;

#[operator]
pub fn mul<A, B, C>(x: A, y: B) -> C
where
    A: core::ops::Mul<B, Output = C>,
{
    x * y
}

fn main() {
    let (x, y) = (3f64, 4f64);
    let dx = mul_gradient!(x);
    let dy = mul_gradient!(y);
    assert_eq!(dx, y);
    assert_eq!(dy, x);
}
```
An automaton is formally described by a 5-tuple consisting of the elements defined below:
Currently, chained operations drop steps when computing the gradient with the autodiff procedural macro.
"WebRTC (Web Real-Time Communication) is a technology that enables Web applications and sites to capture and optionally stream audio and/or video media, as well as to exchange arbitrary data between browsers without requiring an intermediary. The set of standards that comprise WebRTC makes it possible to share data and perform teleconferencing peer-to-peer, without requiring that the user install plug-ins or any other third-party software." MDN
The current implementation only accounts for the two-dimensional case.
When both tensors are one-dimensional, a tensor of Rank(0) should be returned.
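The intended rule for the one-dimensional case can be illustrated with a plain dot product: contracting two rank-1 operands leaves no free axes, so the result is a single scalar, i.e. Rank(0). A sketch with plain slices, not the crate's Tensor type:

```rust
/// Contract two rank-1 tensors; the result carries no axes (rank 0),
/// so it is represented here as a bare scalar.
fn dot(a: &[f64], b: &[f64]) -> f64 {
    assert_eq!(a.len(), b.len(), "contracted axes must agree");
    a.iter().zip(b.iter()).map(|(x, y)| x * y).sum()
}

fn main() {
    // 1*4 + 2*5 + 3*6 = 32
    assert_eq!(dot(&[1.0, 2.0, 3.0], &[4.0, 5.0, 6.0]), 32.0);
}
```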
One of the primary goals of the crate is to support the creation of cloud-native applications written in Rust, leveraging the Axum web framework alongside a modular command-line interface built with Clap.
Currently, anything more than a single expression results in an error. Considering certain limitations of Rust procedural macros, an attribute macro is better suited, as attribute macros are given the full scope of whatever logic is encompassed by the function definition.
```rust
use acme::grad;

#[grad]
fn add(x: f64, y: f64) -> f64 {
    x + y
}

fn main() {
    println!("{:?}", add_prime(2_f64, 0_f64));
}
```
Summary statistics provide a quick summary of the provided information. Each of the following methods should also be capable of being applied along a given axis, denoted with the _axis suffix.
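For example, a mean and its axis-wise counterpart over a row-major 2-D layout might look like this (hypothetical signatures, not the crate's API):

```rust
/// Mean over every element of the tensor.
fn mean(data: &[f64]) -> f64 {
    data.iter().sum::<f64>() / data.len() as f64
}

/// Mean along axis 0 of a (rows x cols) row-major tensor:
/// one value per column.
fn mean_axis0(data: &[f64], cols: usize) -> Vec<f64> {
    let rows = data.len() / cols;
    (0..cols)
        .map(|c| (0..rows).map(|r| data[r * cols + c]).sum::<f64>() / rows as f64)
        .collect()
}

fn main() {
    let data = [1.0, 2.0, 3.0, 4.0]; // [[1, 2], [3, 4]]
    assert_eq!(mean(&data), 2.5);
    assert_eq!(mean_axis0(&data, 2), vec![2.0, 3.0]);
}
```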
The autodiff procedural macro correctly evaluates logic contained within the macro but cannot handle previously defined logic. Specifically, invoked functions and undefined method calls both fail to provide the system with enough information to process the contained logic.
Writing the logic for the sigmoid function within the macro works.

```rust
use acme::autodiff;

fn main() {
    let (x, y) = (1_f64, 2_f64);
    assert!(autodiff!(x, 1.0 / (1.0 + (-x).exp())) == 0.1049935854035065);
}
```
However, invoking a function (or even non-described methods) fails to evaluate correctly since the parsed input fails to store any meaningful information about the function.
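The root cause is that a macro only ever receives tokens, never the callee's body: `sigmoid(x)` arrives as an opaque call expression. This can be seen directly with stringify!, which exposes exactly what a macro sees of its input:

```rust
fn main() {
    // The macro sees only the tokens of the call, not what `sigmoid` does,
    // so there is nothing to differentiate through.
    let tokens = stringify!(sigmoid(x));
    assert_eq!(tokens, "sigmoid(x)");
}
```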
```rust
use acme::autodiff;
use acme::prelude::sigmoid;

fn main() {
    let (x, y) = (1_f64, 2_f64);
    assert!(autodiff!(x, sigmoid(x)) != 0.1049935854035065);
}
```
Traditional macro libraries and methods seem to lack explicit support for engaging with these objects. At initial glance, the problem may require using lower-level Rust libraries to interact with the compiler but support for this is also limited.
One approach would be to use the span of the expression to locate the resource and extract the required information.
```rust
#[gradient]
pub fn multiply<A, B, C>(a: A, b: B) -> C
where
    A: std::ops::Mul<B, Output = C>,
{
    a * b
}
```
By writing a #[proc_macro_attribute], any implemented function could have a gradient function automatically generated. One of the primary issues with this is that we still run into the same problem whenever any external logic is invoked within the function definition.
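The generation side of the idea can be sketched with a declarative macro that defines a function together with a `_prime` companion (a hypothetical stand-in: the real attribute macro would derive the derivative body automatically rather than take it as an argument):

```rust
/// Declarative stand-in for a #[gradient]-style attribute: defines the
/// function and a hand-supplied `_prime` companion in one invocation.
macro_rules! with_gradient {
    ($name:ident, $prime:ident, |$x:ident| $body:expr, |$gx:ident| $gbody:expr) => {
        fn $name($x: f64) -> f64 {
            $body
        }
        fn $prime($gx: f64) -> f64 {
            $gbody
        }
    };
}

// f(x) = x^2, f'(x) = 2x
with_gradient!(square, square_prime, |x| x * x, |x| 2.0 * x);

fn main() {
    assert_eq!(square(3.0), 9.0);
    assert_eq!(square_prime(3.0), 6.0);
}
```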
```rust
extern crate acme;

use acme::autodiff;

#[test]
fn test_array() {
    let x = [1.0, 2.0];
    let y = [2.0, 2.0];
    assert_eq!(autodiff!(x: x + y), 1f64);
    // panics here
    // assert_eq!(autodiff!(x: x + y), [1.0, 0.0]);
}
```
Agents act on their environment through the use of actuators. Here, agents rely upon actors to complete their tasks.
```rust
extern crate acme;

use acme::{autodiff, operator};

#[operator(partial)]
pub fn sigmoid<T>(x: T) -> T
where
    T: num::Float,
{
    x.exp() / (T::one() + x.exp())
}

pub fn sigmoid_prime<T>(x: T) -> T
where
    T: num::Float,
{
    sigmoid(x) * (T::one() - sigmoid(x))
}

fn main() {
    let x = 5f64;
    let (z, dz) = (sigmoid(x), sigmoid_prime(x));
    assert_eq!(autodiff!(lex x: sigmoid_lexical()), dz);
}
```
Try to integrate with the #[operator] macro by collecting the String created by invoking _lexical().
```rust
extern crate acme;

use acme::prelude::{IntoShape, Tensor, TensorResult};

fn main() -> TensorResult<()> {
    let shape = (3, 3).into_shape();
    let n = shape.size();
    let tensor = Tensor::linspace(0f64, n as f64, n).reshape(shape)?;
    let elem = tensor[[0, 0]];
    let slice = tensor[[0, ..]];
    Ok(())
}
```