material-motion-archive / direct-manipulation-swift
Archived February 16, 2017 :: Direct Manipulation for Apple devices
License: Apache License 2.0
All references are in the unit tests.
The gesture recognizers are all declared as let constants right now. They should be vars.
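A minimal, UIKit-free sketch of why this matters (Recognizer and DraggableSketch are stand-in names, not the repo's actual API): a var property lets a client swap in a pre-configured recognizer, which a let constant forbids.

```swift
// Stand-in for UIPanGestureRecognizer, to keep the sketch UIKit-free.
class Recognizer {
  var minimumNumberOfTouches = 1
}

final class DraggableSketch {
  // Declared `var` so a client can inject a configured recognizer;
  // with `let`, the default instance below would be frozen at init.
  var panRecognizer = Recognizer()
}

let draggable = DraggableSketch()
let configured = Recognizer()
configured.minimumNumberOfTouches = 2
draggable.panRecognizer = configured
```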
We should be calculating the relative scale change between each event using division, not subtraction.
E.g.
oldScale = 1.2
newScale = 1.7
difference = 0.5
current code scale = 1 + difference = 1.5
actual amount scaled = 1.7 / 1.2 = 1.41666666666667
Note the discrepancy between the actual "amount scaled" and what we apply to the view.
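The arithmetic above can be sketched in a few lines of Swift (variable names are illustrative, not taken from the repo's implementation):

```swift
// Illustrative sketch of the two approaches; names are not the repo's.
let oldScale = 1.2
let newScale = 1.7

// Current (additive) approach: treats the change as a difference.
let additiveScale = 1 + (newScale - oldScale)   // ~1.5

// Proposed (multiplicative) approach: the true relative scale change.
let relativeScale = newScale / oldScale         // ~1.4167
```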
The gesture recognizer may already be added to a view (e.g. the parent view), so we may not want to call target.addGestureRecognizer ourselves in GesturePerformer.
Must be run by a member of the @material-motion/core-team:
mdm release publish 1.0.0
This will allow us to reuse a good amount of the logic.
This will allow us (and others) to reuse these classes in unit test targets.
This must be run by a @material-motion/core-team member.
mdm release cut
This is leaking the implementation details of the plan.
Instead, let's just expose the gesture recognizers. We intentionally won't expose shouldAdjustAnchorPointOnGestureStart because we always want this property to be true for DirectlyManipulable.
Every public API needs to be documented.
You can verify this by running:
jazzy
open docs/index.html
and observing the percentage at the top of the page.
jazzy
cd docs
git init
git add .
git commit -am "Initial commit of documentation"
git branch -m gh-pages
git remote add origin git@github.com:material-motion/material-motion-family-pop-swift.git
git push origin gh-pages
AdjustsAnchorPoint will be emittable by DirectManipulation and directly addable by clients composing directly-manipulable themselves.
Should compose out to Draggable, Pinchable, and Rotatable.
Only DirectlyManipulable should.
The rationale: if you are using any of Pinchable/Rotatable/Draggable on its own, it is very unlikely that you will want to modify the anchor point each time the gesture recognizer begins. This behavior only makes sense when all three plans operate in tandem (i.e. with DirectlyManipulable).
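As background for the rationale above, here is a UIKit-free sketch of the anchor-point compensation that happens on gesture start (all names illustrative): when the anchor point moves within the view's bounds, .position must shift by the same offset to keep the view visually stationary.

```swift
// UIKit-free sketch of anchor-point compensation; names illustrative.
// Anchor points are in unit coordinates (0...1) relative to the bounds.
struct Point { var x: Double; var y: Double }

func compensatedPosition(position: Point,
                         oldAnchor: Point, newAnchor: Point,
                         width: Double, height: Double) -> Point {
  // Moving the anchor by (dx, dy) in unit space moves where .position
  // lands by (dx * width, dy * height), so shift position to match.
  return Point(x: position.x + (newAnchor.x - oldAnchor.x) * width,
               y: position.y + (newAnchor.y - oldAnchor.y) * height)
}
```

For example, for a 100x100 view with .position at (50, 50), moving the anchor from the center (0.5, 0.5) to the corner (1, 1) requires .position to become (100, 100).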
Blocked on #20.
Use two-space indentation throughout the code.
Gesturable likely doesn't need to conform to Plan because it doesn't do much on its own. In general we should be conservative about which objects conform to Plan.
Coverage is at 70% as of 3167a04.
All performer logic should be added to Swift files in the src/private sub-directory.
See the Core Animation family for an example: https://github.com/material-motion/material-motion-family-coreanimation-swift/tree/develop/src/private
See the ContinuousPerforming spec: https://material-motion.github.io/material-motion/starmap/specifications/runtime/Performer
One idea is to visualize the anchor point of the view and to allow you to change it by tapping anywhere in the view. We could include labels showing the view's position in order to show what's going on with the underlying .position value when we change the anchor point.
Should include the following:
Assume that this repo is the entry-point for an application developer who has never used Material Motion before. We'll likely want to link to some form of "getting started with material motion" document. Life of a plan might be a good start, but we should have a swift version as well.
Need to rename the @objc(MDM...) name mapping.
UIKit's gesture recognizers include a slop region. This slop region isn't always desirable for direct manipulation scenarios.
We should explore how we might provide pinch/rotate/pan support without a slop region.
One option is to implement a new UIGestureRecognizer subclass. This means we need to implement our own velocity calculations, which poses a serious risk of deviating from UIKit's velocity calculation logic.
If we can somehow utilize the existing gesture recognizers, that would be preferable.
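To illustrate that risk: a custom recognizer would have to compute velocity itself, and the naive two-sample estimate below (hypothetical, not UIKit's algorithm) can easily diverge from the smoothed values UIKit reports via velocity(in:).

```swift
// Hypothetical, naive velocity estimate (points per second) from the
// last two touch samples along one axis. UIKit's smoothing behavior is
// undocumented, so a recognizer built on this could disagree with
// UIPanGestureRecognizer's reported velocity.
struct TouchSample { var x: Double; var timestamp: Double }

func naiveVelocity(_ previous: TouchSample, _ current: TouchSample) -> Double {
  return (current.x - previous.x) / (current.timestamp - previous.timestamp)
}
```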
This plan is somewhat too powerful in that it allows arbitrary blocks to be provided to it. This can leak separation of concerns back to the creator of the plan.
Example:
BlockGesturable(withGestureRecognizer: gesture) { gesture in
  // Dangerous: access internal state of the registering agent
}