Comments (13)
Yep, I'd be interested in taking this issue! Thanks, I'll be in touch!
from auto_impl.
Hey, is this issue still available?
@saresend To my knowledge, no one is working on this yet. So yep, it's available! Do you want to work on this? :)
You can of course always ask questions in this issue, but you can also contact me via mail (address on my GitHub profile), via Discord (on the Rust server https://discord.gg/2cr8M4D) or via Telegram (username is the same as my GitHub username). I think that often, real time messaging or at least non-public messaging is easier and more effective, if you want to get quick help.
Hi @saresend, I just wanted to check your status. Are you still working on this? But no worries: I don't want to rush you! If you have problems trying to tackle this issue, just let me know!
Hey! Yeah, I'll likely tackle it this weekend, I'm currently in university and so my availability is quite sporadic :P Expect to hear from me sometime tomorrow!
Fantastic! :)
(and yeah, who doesn't know those time problems...)
Hey, I realize that I really lack understanding of procedural macros and how they function in Rust. Do you know of any resources to get up to speed? Thanks!
@saresend No problem, that's why we offer this mentoring :)
There is actually a section in the book about proc macros. But let me quickly summarize it for you in the context of this crate (more explanations never hurt, right?):
There are three different kinds of proc macros: custom derives (`#[derive(Serialize)]`), function-like macros (`foo!()`) and -- the one we're interested in -- custom attributes (`#[auto_impl()]`). All proc macros work very similarly. The creator of the proc macro writes a function that "defines" the proc macro. This function gets one or two `TokenStream`s and returns a `TokenStream`. The function needs to be annotated with `#[proc_macro_derive(Serialize)]`, `#[proc_macro]` or `#[proc_macro_attribute]` for the three kinds of proc macros, respectively. You can see this function in this crate here:
```rust
#[proc_macro_attribute]
pub fn auto_impl(args: TokenStream, input: TokenStream) -> TokenStream {
    ...
}
```
So what's a `TokenStream`? It is simply a list of tokens. But what's a token? It's a very small unit of source code. Producing tokens is usually the first step of every compiler: transform the list of characters (the source code) into a list of tokens. Roughly speaking, if you add whitespace everywhere in your code where you're allowed to, `source_code.split_whitespace()` is the list of tokens. An example:
```rust
trait Foo<T> {
    fn foo();
}
```
This results in the tokens:
- `trait`
- `Foo`
- `<`
- `T`
- `>`
- `{`
- `fn`
- ...
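The "add whitespace, then split" intuition can be sketched as a toy tokenizer. This is a hypothetical helper for illustration only -- the real lexer in rustc is much more careful (string literals, comments, multi-character operators) -- but for simple input it produces the list above:

```rust
// Toy illustration of the "add whitespace everywhere, then
// split_whitespace" idea. NOT how rustc actually lexes.
fn toy_tokenize(src: &str) -> Vec<String> {
    let mut spaced = String::new();
    for c in src.chars() {
        // Surround every punctuation character with spaces, so that
        // `Foo<T>` becomes `Foo < T >`.
        if "<>{}();,".contains(c) {
            spaced.push(' ');
            spaced.push(c);
            spaced.push(' ');
        } else {
            spaced.push(c);
        }
    }
    spaced.split_whitespace().map(str::to_owned).collect()
}

fn main() {
    let tokens = toy_tokenize("trait Foo<T> { fn foo(); }");
    assert_eq!(
        tokens,
        ["trait", "Foo", "<", "T", ">", "{", "fn", "foo", "(", ")", ";", "}"]
    );
}
```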
The type `TokenStream` is defined in the crate `proc_macro`, which -- like, for example, `std` -- comes with the compiler. You can see the documentation here. Note the `IntoIter` implementation: it is an iterator over `TokenTree`s. That's where the `TokenStream` differs a bit from the "list of tokens" idea: Rust already parses so-called "groups". A group is a list of tokens enclosed by either `()`, `[]` or `{}`. Each group is represented by one `TokenTree`.
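To make the nesting concrete, here is a simplified, self-contained model of that structure. These are toy types, not the real `proc_macro::TokenTree` (which also carries spans and punctuation spacing); they only show how a group nests a whole sub-stream:

```rust
// Toy model of `proc_macro::TokenTree` -- NOT the real API.
#[derive(Debug, PartialEq)]
enum ToyTokenTree {
    Ident(String),
    Punct(char),
    // The open delimiter plus the inner tokens: `(...)`, `[...]` or `{...}`.
    Group(char, Vec<ToyTokenTree>),
}

// `trait Foo<T> { fn foo(); }` as a toy token stream.
fn example_stream() -> Vec<ToyTokenTree> {
    use ToyTokenTree::*;
    vec![
        Ident("trait".into()),
        Ident("Foo".into()),
        Punct('<'),
        Ident("T".into()),
        Punct('>'),
        // The whole trait body is ONE group with its own token list.
        Group('{', vec![
            Ident("fn".into()),
            Ident("foo".into()),
            Group('(', vec![]),
            Punct(';'),
        ]),
    ]
}

fn main() {
    // Only six top-level token trees: the `{ ... }` body counts as one.
    assert_eq!(example_stream().len(), 6);
}
```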
Let's take the example code above and print the actual `TokenTree`s we get. Let's change the `lib.rs` to:
```rust
#[proc_macro_attribute]
pub fn auto_impl(args: TokenStream, input: TokenStream) -> TokenStream {
    for token_tree in input {
        println!("{:?}", token_tree);
    }
    TokenStream::new() // empty result to make it compile
}
```
And add an `examples/test.rs` file:
```rust
use auto_impl::auto_impl;

#[auto_impl()]
trait Foo<T> {
    fn foo();
}

fn main() {}
```
And finally, run `cargo build --example test`. We get this output:
```
Ident { ident: "trait", span: #0 bytes(43..48) }
Ident { ident: "Foo", span: #0 bytes(49..52) }
Punct { ch: '<', spacing: Alone, span: #0 bytes(52..53) }
Ident { ident: "T", span: #0 bytes(53..54) }
Punct { ch: '>', spacing: Alone, span: #0 bytes(54..55) }
Group { delimiter: Brace, stream: TokenStream [Ident { ident: "fn", span: #0 bytes(62..64) }, Ident { ident: "foo", span: #0 bytes(65..68) }, Group { delimiter: Parenthesis, stream: TokenStream [], span: #0 bytes(68..70) }, Punct { ch: ';', spacing: Alone, span: #0 bytes(70..71) }], span: #0 bytes(56..73) }
```
(Side note: yes, we just got our own output while executing `cargo build` -- without running the actual example. That happens because our proc macro code is executed by the compiler while compiling other crates.)
So that should give you a rough idea what a token stream is.
Lastly, you probably noticed that our function takes two `TokenStream`s: `input` and `args`. The former contains all tokens of the item our attribute is attached to (the trait). `args` contains the tokens inside the actual attribute. So `#[auto_impl(&, Box)]` would contain the tokens `&`, `,` and `Box`.
After the compiler has called our function, it takes the token stream we returned and replaces the original trait definition with it. So if we always return `TokenStream::new()`, we basically just delete the trait definition. (Side note: the token stream returned by a custom derive is *added* to the original token stream, so custom derives cannot modify the definition of the item they are attached to.)
Usually you want to interpret the token stream as a Rust item (e.g. a trait). That's what the crate `syn` is for: it can parse a token stream into an AST (abstract syntax tree). This makes it way easier to, say, iterate over all methods of a trait. That's one of the first things we do: we tell `syn` to parse our token stream as a trait. This results in an `ItemTrait` (this is one AST node; in our case, the root node).
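To get a feel for why an AST helps, here is a drastically simplified, hypothetical stand-in for what `syn` produces (toy types, not the real `syn` API): "iterate over all methods of the trait" becomes a plain loop over a field, instead of token matching.

```rust
// Toy stand-in for `syn::ItemTrait` -- NOT the real syn API.
#[derive(Debug)]
struct ToyTraitMethod {
    name: String,
}

#[derive(Debug)]
struct ToyItemTrait {
    name: String,
    generics: Vec<String>,
    methods: Vec<ToyTraitMethod>,
}

// `trait Foo<T> { fn foo(); }` "parsed" into the toy AST.
fn example_item() -> ToyItemTrait {
    ToyItemTrait {
        name: "Foo".to_owned(),
        generics: vec!["T".to_owned()],
        methods: vec![ToyTraitMethod { name: "foo".to_owned() }],
    }
}

// With an AST, collecting all method names is trivial.
fn method_names(item: &ToyItemTrait) -> Vec<&str> {
    item.methods.iter().map(|m| m.name.as_str()).collect()
}

fn main() {
    let item = example_item();
    assert_eq!(item.generics, ["T"]);
    assert_eq!(method_names(&item), ["foo"]);
}
```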
With that information, we can start generating our output. We don't want to modify the trait definition, but only add tokens. For each of our so-called "proxy types", we need to generate one `impl` block. So in `#[auto_impl(&, Box)]`, we have two proxy types: `&` and `Box`. So we will emit two impl blocks.
Let's see an example again: the one from above, but with `&` and `Box` added (and of course, we reset the definition of the proc macro function in `lib.rs`):
```rust
use auto_impl::auto_impl;

#[auto_impl(&, Box)]
trait Foo<T> {
    fn foo();
}

fn main() {}
```
We can now use the great `cargo expand` tool to show the code generated by our macro. `cargo expand --example test` shows:
```rust
#![feature(prelude_import)]
#![no_std]
#[prelude_import]
use std::prelude::v1::*;
#[macro_use]
extern crate std as std;
use auto_impl::auto_impl;
trait Foo<T> {
    fn foo();
}
impl<'a, T, U: 'a + Foo<T>> Foo<T> for &'a U {
    fn foo() {
        U::foo()
    }
}
impl<T, U: Foo<T>> Foo<T> for ::std::boxed::Box<U> {
    fn foo() {
        U::foo()
    }
}
fn main() {}
```
(You can ignore the stuff at the very top, that's not from us.)
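The generated impls are ordinary Rust. The same forwarding pattern, written by hand for a made-up trait whose method returns a value (so the forwarding is observable), looks like this:

```rust
// Hand-written version of the forwarding pattern that auto_impl
// generates, for a made-up trait `Greet`.
trait Greet {
    fn greet(&self) -> String;
}

struct World;

impl Greet for World {
    fn greet(&self) -> String {
        "hello world".to_owned()
    }
}

// Forward `&U` to `U`, like the generated `impl ... for &'a U`.
impl<'a, U: Greet> Greet for &'a U {
    fn greet(&self) -> String {
        (**self).greet()
    }
}

// Forward `Box<U>` to `U`, like the generated `impl ... for Box<U>`.
impl<U: Greet> Greet for Box<U> {
    fn greet(&self) -> String {
        (**self).greet()
    }
}

// Thanks to the extra impls, a generic bound now accepts owned
// values, references, AND boxes.
fn greet_any<G: Greet>(g: G) -> String {
    g.greet()
}

fn main() {
    assert_eq!(greet_any(World), "hello world");
    assert_eq!(greet_any(&World), "hello world");
    assert_eq!(greet_any(Box::new(World)), "hello world");
}
```

This is exactly what the two emitted impl blocks buy you: callers can pass `&T` or `Box<T>` wherever a `Foo<T>` implementor is expected.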
The generation of these impl blocks is the complicated part. All of it starts in `gen::gen_impls` (lines 19 to 24 in d7ba83a).
One additional thing: you will see the `quote! {}` macro a lot in our code. It comes from the `quote` crate and is an easy way to generate a token stream. So instead of creating `TokenTree`s and all of that jazz yourself, you can simply write `quote! { fn foo() }`, and this results in a token stream that contains three token trees (the ident `fn`, the ident `foo` and the group `()`).
A bit confusing: `quote!` returns a `proc_macro2::TokenStream`. The crate `proc_macro2` is very similar to `proc_macro` and is used, among other things, for testing. Read its README for more information. The important part is that it's used a lot: we create `proc_macro2::TokenStream`s with `quote` and, at the very end, convert them back to the expected `proc_macro::TokenStream`.
So I hope that serves as an introduction. For more questions, just ask in the gitter channel :)
Hey, just wanted to ask a couple of things about how testing is set up. Specifically, since it depends on the unstable build-plan feature to determine the binary target, the tests are failing to execute. I was wondering if there is a way to rely on something more stable for determining the binary target; specifically, I was looking at using `cargo build --tests --message-format=json` and parsing the output to discover the binary file. Thanks!
@saresend I'm not sure if you've seen that but @KodrAus noticed this problem and created rust-lang/cargo#6082. On recent nightlies, our tests fail because of that, yes. But it's already fixed upstream -- now we only have to wait until it lands in nightly.
But your idea is very interesting! Ideally, something like a compile-fail tester should be implemented in an external crate, not re-implemented in every project that wants such a test. Maybe I'll look into using `--message-format=json` -- but I don't have a lot of time right now.
Some background on the current build system can be found here: #17
Lastly, if you want to test your changes now, you can use an older nightly compiler. For example, `nightly-2018-09-15` works:

```
$ rustup override set nightly-2018-09-15
$ cargo test
```
Oh, cool! Thanks for the heads up!
Fixed in #35 :)