
Helit

This is basically a dumping ground for the code that I develop for my research. I am a firm believer that all source code should be published, for the same reasons that the research itself must be published. That said, I often omit the code that generated a paper's results, simply because I don't have the bandwidth to distribute the data sets or the time to clean up what is often a horrific mess. Regardless, the actual algorithm is always available for people to run their own experiments with, and if you request testing code from me I'll probably provide it, under the condition that you're not allowed to laugh. It is all implemented in Python, but some modules also contain C/C++ code for the bits that need to be fast, either inline via scipy.weave or as proper Python modules that need to be compiled. In both cases, as long as Python has access to a compiler, the code should be automatically compiled when first run (this is reliable on Linux, but can take some setup/fiddling before it works on Mac/Windows). A few modules can also be installed via the standard setup.py mechanism. You may find my personal website at http://thaines.com

I tend to use a variety of licenses, typically BSD, Apache and GPL - check the headers. Utility stuff is usually BSD, implementations of other people's algorithms Apache (roughly summarised as BSD but with a longer license!) and my own research code Apache or GPL, depending on mood/project constraints. It's mostly Apache 2. I do licensing on a directory-by-directory basis - you will not find differently licensed files within a single directory, and the dependencies between directories obviously follow the relevant requirements. If you have good cause to request the code under another license feel free to ask - it's not like I am attempting to make money out of this, so if it's reasonable I will say yes.

It should be noted that I develop exclusively on 64 bit Linux, but being Python most of this code should just work on other platforms. scipy and OpenCV (multiple versions!) are also used - most if not all modules use scipy, whilst OpenCV is mostly used by the test scripts rather than the modules themselves. Installing either of these libraries should be easy on mainstream platforms. scipy.weave is used by some modules and is a little tricky to install, as it requires a C++ compiler to be available. Whilst it works out of the box on most Linux installs, assuming you have a compiler, it can require some effort to get working elsewhere. It's not particularly hard under Windows - just google the instructions - but I have heard that it is a total pain if you are using a Mac. You might want to investigate running Linux inside a virtual machine instead (I've successfully done this with VirtualBox - http://www.virtualbox.org/), as it is just easier. Whilst I'm 90% sure that the code should work on 32 bit operating systems, there is a risk that I have missed something, as I never test on anything so antiquated. Also, I use soft links in some cases, which I don't think Windows supports - the easiest solution is to duplicate the data; alternatively you can add the root directory to Python's path environment variable. Or, if the code supports it, just install it with setup.py.

This code is primarily for research, which is to say it is not of industrial standard. Some of it is sufficiently neat and robust that it would probably pass the needed testing regime once you slapped some sanity checking code in. Some of it is not, however - it depends mostly on how far out the next deadline was! Regardless, do report bugs to me so I can try to find the time to fix them. I'll also consider feature requests, so don't be afraid to ask.

If you use this code for your own research then it would be great if you could cite me. Much of the code is for specific papers - that will usually be given in the description, so the paper it was written for would be the obvious choice. When that is not the case, i.e. the code is stand-alone, a competitor I was testing against or for a failed research project (!), then use something like:

@MISC{helit,
  author = {T. S. F. Haines},
  title = {\url{https://github.com/thaines/helit}},
  year = {2010--},
}

If you're doing cool stuff with my code - industry, research, personal project - it does not matter, then I would love to hear about it :-)

If anyone wants to contact me for any of the above reasons, or any other good reason, then my email address is [x]@[y].com where [x]=thaines and [y]=gmail

Current contents (alphabetical):

ddhdp: (GPL 3.0) My Delta-Dual Hierarchical Dirichlet Processes implementation, from the paper 'Delta-Dual Hierarchical Dirichlet Processes: A pragmatic abnormal behaviour detector' by T. S. F. Haines and T. Xiang (can be obtained from my website). It shares a lot of code with the dhdp implementation, unsurprisingly, making similarly extensive use of scipy.weave. Also, if dhdp is complex this is bordering on the insane - it's an awful lot of very complex code, courtesy of a lot of variables that need to be Gibbs sampled.

ddp: (Apache 2.0) Discrete dynamic programming. Nothing special.
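
As a flavour, a minimal Viterbi-style chain DP in numpy - an illustrative sketch, not this module's interface:

import numpy as np

def chain_dp(unary, pairwise):
    # Discrete DP over a chain: unary is (n, labels) costs per node,
    # pairwise is (labels, labels) transition costs. Returns the minimum
    # cost labelling via forward accumulation plus backtracking.
    n, L = unary.shape
    cost = unary[0].copy()
    back = np.zeros((n, L), dtype=int)
    for i in range(1, n):
        total = cost[:, None] + pairwise          # (prev label, next label)
        back[i] = np.argmin(total, axis=0)
        cost = total[back[i], np.arange(L)] + unary[i]
    best = np.zeros(n, dtype=int)
    best[-1] = np.argmin(cost)
    for i in range(n - 1, 0, -1):
        best[i - 1] = back[i, best[i]]
    return best, cost.min()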

df: (Apache 2.0) A decision forest (random forest) implementation. It is extremely modular, for future expandability, and very flexible, with support for incremental learning and both continuous and discrete features, but its current feature set is rather limited: it only supports classification, with a limited set of test generation techniques - basically the standard feature set of a typical random forest implementation. Currently it is Python with numpy/scipy only; the code is well commented and should be relatively easy to understand.

dhdp: (Apache 2.0) A Dual Hierarchical Dirichlet Processes implementation, using Gibbs sampling. Also includes the ability to switch off the document clustering and obtain a HDP implementation. It's rather complex - a lot is going on, and its design is arguably not the best. Makes extensive use of scipy.weave, as most of the code is actually in C++.

dp_al: (GPL 3.0) An active learning method of mine that uses Dirichlet processes. The relevant paper is 'Active Learning using Dirichlet Processes for Rare Class Discovery and Classification' by T. S. F. Haines and T. Xiang, from BMVC 2011 (can be obtained from my website). A very simple algorithm that works very well when faced with the competing goals of finding new categories, particularly rare ones, and refining the classification boundaries of existing categories. It is a bit absurd for me to put code up, given that the implementation is so easy, so the main value here is the test code and the framework that allows comparison with a bunch of other active learning algorithms that are also implemented within this module.

dpgmm: (Apache 2.0) A Dirichlet process Gaussian mixture model, implemented using the mean field variational technique. It's about as good as a general purpose density estimator can get, though it suffers from a heavy computational and memory burden. The code is pure Python, depends on the gcp module, and is very neat and reasonably well commented - speed is reasonable for the method, as the code vectorises well. Unlike some other implementations it handles incremental learning correctly - both adding extra sticks to the model and adding extra data after convergence. This code was used for https://arxiv.org/abs/1801.08009 - that or my background subtraction journal paper make sense to cite if you use this code.
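
As a flavour of the underlying model, here is a sketch of the expected stick-breaking weights that a mean field posterior defines - illustrative maths only, with names of my choosing, not the module's API:

import numpy as np

def expected_stick_weights(v_a, v_b):
    # Expected mixture weights under mean field stick-breaking, where stick
    # k has posterior Beta(v_a[k], v_b[k]); weight k is E[v_k] times the
    # expected length of what remains after the earlier sticks.
    ev = v_a / (v_a + v_b)                                   # E[v_k]
    rest = np.concatenate(([1.0], np.cumprod(1.0 - ev)[:-1]))
    return ev * rest                                         # E[v_k] * prod_{j<k} E[1 - v_j]

# e.g. three sticks with most mass on the first:
# expected_stick_weights(np.asarray([5., 1., 1.]), np.asarray([1., 1., 1.]))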

dp_utils: (BSD) A collection of C++ code for handling Dirichlet processes. Used by other modules in the system (dhdp and ddhdp).

frf: (Apache 2.0) A standard random forest implementation, all in C and designed for speed, with good I/O and support for multi-processing, plus all the usual bells and whistles. Was created for the paper 'My Text in Your Handwriting' when I got frustrated with the scikit-learn implementation's horrible I/O speed.

gbp: (Apache 2.0) Gaussian belief propagation implementation. Also does Gaussian TRW-S - make of that what you will! This is really just a simple linear solver, but with an interface that is really helpful for solving certain problems. If your model is a chain then it's Kalman smoothing. The interface is fairly advanced, and will allow you to solve problems, modify them, then solve them again, quicker because you're already close. It was the core approach of my paper 'Integrating Stereo with Shape-from-Shading derived Orientation Information', though this is a reimplementation done for my more recent paper, 'My Text in Your Handwriting'. Includes a script for solving sparse linear problems and another for removing curl from normal maps.
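
To make the 'just a linear solver' point concrete, a tiny numpy illustration (the names here are mine, not the module's): the mean of a Gaussian model in information form satisfies a linear system, which is exactly what belief propagation computes on tree-structured models.

import numpy as np

# Information form: precision matrix Lambda and information vector eta.
# The mean (equivalently the MAP) satisfies Lambda @ mu = eta, so inference
# really is a linear solve; Gaussian BP on a tree-structured Lambda arrives
# at the same answer by passing messages.
Lambda = np.array([[ 2.0, -1.0,  0.0],
                   [-1.0,  2.0, -1.0],   # tri-diagonal = chain structure
                   [ 0.0, -1.0,  2.0]])
eta = np.array([1.0, 0.0, 1.0])

mu = np.linalg.solve(Lambda, eta)        # what BP's means converge to
var = np.diag(np.linalg.inv(Lambda))     # the marginal variances BP reports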

gcp: (Apache 2.0) The code associated with my Gaussian conjugate prior cheat sheet. Basically an implementation of exactly what you would expect, though it has gained a few extra features/tweaks since I first uploaded it as an archive to download from my website, due to it seeing some real world usage. It of course has a conjugate prior class, with the ability to provide new samples and draw a Gaussian, or draw a Student-t distribution that is the probability of a sample drawn from the Gaussian drawn from the prior, but with the Gaussian integrated out. Additionally, classes to represent Gaussian, Wishart and Student-t distributions are provided, as well as an incremental Gaussian fitter. All written in pure Python; actually fast, due to using einsum!
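
For the curious, a minimal 1D sketch of the conjugate update and Student-t predictive described above, using scipy.stats - the function and parameter names are mine, and the module itself handles the multivariate case:

import numpy as np
from scipy import stats

def niw_update_1d(x, mu0=0.0, kappa0=1.0, alpha0=1.0, beta0=1.0):
    # Normal-inverse-gamma conjugate update for 1D data with unknown mean
    # and variance, returning the Student-t posterior predictive (the
    # Gaussian integrated out).
    n, xbar = len(x), np.mean(x)
    ssd = np.sum((np.asarray(x) - xbar) ** 2)
    kappa = kappa0 + n
    mu = (kappa0 * mu0 + n * xbar) / kappa
    alpha = alpha0 + 0.5 * n
    beta = beta0 + 0.5 * ssd + 0.5 * kappa0 * n * (xbar - mu0) ** 2 / kappa
    scale = np.sqrt(beta * (kappa + 1) / (alpha * kappa))
    return stats.t(df=2 * alpha, loc=mu, scale=scale)  # predictive distribution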

gmm: (Apache 2.0) A fairly standard Gaussian mixture model library that uses k-means for initialisation, followed by EM to fit the actual model. Only supports isotropic kernels, but then non-isotropic kernels can cause stability issues with EM. Has a Bayesian information criterion (BIC) implementation, for automatically selecting the number of clusters. Includes some very nice, and particularly fast, k-means implementations, in case that is all you want. Makes extensive use of scipy.weave.
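
For reference, the BIC used for this kind of model selection is simple to state (a sketch with illustrative names, not the module's code):

import numpy as np

def bic(log_likelihood, n_params, n_samples):
    # Bayesian information criterion - lower is better. Model selection is
    # then just fitting a range of cluster counts and keeping the minimum.
    return n_params * np.log(n_samples) - 2.0 * log_likelihood

# For k isotropic Gaussians in d dimensions the parameter count would be
# k*d (means) + k (variances) + k-1 (mixing weights).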

graph_cuts: (Apache 2.0) Just a max flow implementation and an interface for doing binary labellings on nD grids.

handwriting: (Mixed licenses) All of the code specific to the paper 'My Text in Your Handwriting', by T. S. F. Haines (me), O. Mac Aodha and G. J. Brostow.

hg: (Apache 2.0) Simple module for constructing and applying homographies to images, plus some related gubbins.
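
The core operation is tiny; a plain numpy sketch of applying a homography to points (not the module's interface):

import numpy as np

def apply_homography(H, points):
    # Map (n, 2) points through a 3x3 homography: lift to homogeneous
    # coordinates, multiply, then do the perspective divide.
    pts = np.concatenate([points, np.ones((len(points), 1))], axis=1)
    out = pts @ H.T
    return out[:, :2] / out[:, 2:3]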

kde_inc: (Apache 2.0) A simple incremental kernel density estimation module with Gaussian kernels, that uses a fixed number of kernels and greedily merges kernels together when it exceeds that cap. Also includes leave-one-out optimisation of the kernel size, in terms of a symmetric precision matrix. Nothing special really - I only implemented it so I could match the experimental setup of another paper. It's simple and neat however, and I guess it could be useful when the fixed time property matters, though if that is really the case you would probably want a C version, rather than this Python-only snail.
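
A sketch of the kind of moment-matched merge such a greedy scheme performs when it hits its kernel cap - illustrative, not the module's actual code:

import numpy as np

def merge_gaussians(w1, mu1, cov1, w2, mu2, cov2):
    # Merge two weighted Gaussian kernels into one that matches the first
    # and second moments of their mixture.
    w = w1 + w2
    a, b = w1 / w, w2 / w
    mu = a * mu1 + b * mu2
    d1, d2 = mu1 - mu, mu2 - mu
    cov = a * (cov1 + np.outer(d1, d1)) + b * (cov2 + np.outer(d2, d2))
    return w, mu, cov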

lda_gibbs: (Apache 2.0) A latent Dirichlet allocation implementation using Gibbs sampling. Written as a learning exercise and so not the best organised code, but fairly well commented, so has possible educational value. Uses scipy.weave and is hence reasonably fast.

lda_var: (Apache 2.0) A latent Dirichlet allocation implementation using the mean field variational method. Again, written as a learning exercise. The code is disturbingly neat, plus short, and probably quite educational. It is in straight Python without any C++, so it is easy to run - it could of course be faster using inline code, but still manages to be an order of magnitude faster than the Gibbs approach, which does use C++.

misc: (Apache 2.0) Assorted miscellaneous things - a dumping ground for things that are not large enough to justify a module of their own.

ms: (Apache 2.0) Provides a mean shift implementation, but also includes kernel density estimation and subspace constrained mean shift using the same object, such that they all use the same underlying density estimate. Also has the multiplication method from the "non-parametric belief propagation" paper, fully generalised to work with all possible kernels - this code makes it trivial to do belief propagation over continuous data with complex kernels. Includes multiple spatial indexing schemes and kernel types, including multiple options for different kinds of directional data, and the ability to combine kernels. For instance, you can have a Gaussian on position and a mirrored Fisher on rotation (as a quaternion) if doing density estimation on the position and rotation of objects. You can even then do belief propagation on relative object positions using the multiplication capability. Clustering is supported, with a choice of cluster intersection tests, as well as the ability to interpret exemplar indexing dimensions of the data matrix as extra features, so it can handle the traditional image segmentation scenario sensibly. Can do on-the-fly conversion of data types, so you can have an angle in radians in the feature vector that is converted to a unit vector with a Fisher distribution over it, but only internally, so it's returned to the user as an angle, for instance. This module is all in C, and has a setup.py file so it can be installed. It's also one of my most heavily engineered modules, with loads of test files (some of which create really pretty outputs ;-) ). The code could probably be used in production, and was used by my paper 'My Text in Your Handwriting'.
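
For orientation, a bare-bones Gaussian-kernel mean shift step in numpy - the module generalises this basic idea to other kernels, directional data and composite feature vectors (sketch only, not its interface):

import numpy as np

def mean_shift(data, x, bandwidth, iters=64, tol=1e-6):
    # Repeatedly move x to the kernel-weighted mean of the data until it
    # stops moving - i.e. climb to a mode of the KDE.
    for _ in range(iters):
        d2 = ((data - x) ** 2).sum(axis=1)
        w = np.exp(-0.5 * d2 / bandwidth ** 2)      # Gaussian kernel weights
        new_x = (w[:, None] * data).sum(axis=0) / w.sum()
        if np.linalg.norm(new_x - x) < tol:
            break
        x = new_x
    return x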

p_cat: (Apache 2.0) A bunch of probabilistic classifiers, using some of the density estimation methods also in this code base. Does it all via a standard interface, so they can be swapped out, and includes incremental learning and, somewhat unusually, the probability of a sample belonging to an unknown class, as calculated under a Dirichlet process assumption (this last feature is for a paper).
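
The 'unknown class' probability mentioned above follows directly from the Chinese restaurant process view of a DP; a sketch (illustrative names, not the module's interface):

def prob_new_class(alpha, counts):
    # Under a Dirichlet process with concentration alpha, the prior
    # probability that the next sample belongs to a previously unseen class,
    # given per-class counts of the n samples seen so far.
    n = sum(counts)
    return alpha / (alpha + n)

# e.g. alpha=1.0 with 9 labelled samples gives prob_new_class(1.0, [4, 5]) == 0.1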

ply2: (BSD) My extension of the ply file format, for more general use. Includes a specification and pure Python reading/writing code. The intention is for this to be a good format to fill the gap between json and hdf5, particularly for large quantities of human readable data. Was created for the handwriting project, so particular attention has been paid to string support. Done without the permission of the original authors, so I hope they don't mind!

rlda: (GPL 3.0) My region LDA implementation - see the paper 'Video Topic Modelling with Behavioural Segmentation' by T. S. F. Haines and T. Xiang (Downloadable from my website.). A topic model that includes a behavioural segmentation within the same model, which is simultaneously solved. Designed to be good at analysing traffic data by virtue of it inferring the regions of the input where each activity occurs, giving it a better generalisation capability.

smp: (GPL 3.0) This Sparse Multinomial Posterior library solves the rather unusual problem of estimating a multinomial given draws from it, when those draws are sparse. That is, when the counts for some of the entries are missing. Generates the expected value of the multinomial being estimated. Uses scipy.weave.
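
For contrast with the sparse case this module solves, the fully observed case is a one-liner; a sketch (names mine):

import numpy as np

def dirichlet_posterior_mean(counts, prior=1.0):
    # Posterior mean of a multinomial under a symmetric Dirichlet prior when
    # every entry's count is observed - the easy case. The module handles the
    # harder one where some entries of the count vector are missing entirely.
    c = np.asarray(counts, dtype=float) + prior
    return c / c.sum()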

svm: (Apache 2.0) A support vector machine implementation that only supports the traditional classification scenario. It is however implemented using some of the best available methods, and includes leave-one-out model selection, with a decent set of kernel types and various useful defaults for when you just want to throw data at it. It's not as optimised as it could be, but is still plenty fast. I know there are already various other libraries for doing this with Python, but my testing showed serious issues with all of them - specifically strange restrictions, lack of multiclass support and an inability to save models, not to mention absolutely dire interfaces and little documentation. That was only from a Python point of view however - if you don't care which language you use then there are better libraries - and it describes a historic problem, which is no longer the case.

swood: (Apache 2.0) My old random forest implementation - perfectly functional, but limited, and was created as a learning exercise. I would recommend using one of my other implementations instead.

utils: (BSD) Basic utilities used by almost all of my modules. Most useful are the progress bar class and the make code, which means all my Python C modules compile automatically - great when doing research, as forgetting to run make is no longer a problem! Unusual stuff includes Python code to change a Python program's process name (good for those killall's), a function that adds line numbers to scipy.weave code, which is essential if you want to debug it, a multiprocess map replacement, and code for generating documentation.
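
The line numbering trick is simple enough to sketch - this is a re-sketch of the idea, not the module's own function:

def number_lines(code):
    # Prefix each line of a source string with its 1-based line number, so
    # compiler errors from inline C/C++ snippets can be traced back.
    return '\n'.join('%4i: %s' % (i + 1, line)
                     for i, line in enumerate(code.split('\n')))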

utils_gui: (BSD) An image viewer for GTK 3. Works using a tile concept (for speed) and supports efficient zooming, absurdly large images and using a tablet to control it.

video: (GPL 3.0) A node based system for processing video, that includes various computer vision algorithms, e.g. optical flow, background subtraction. This last one is my own algorithm: 'Background Subtraction with Dirichlet Processes' by Tom SF Haines & Tao Xiang, ECCV 2012. Uses OpenCV and scipy; also compiles C code and uses OpenCL when available.


helit's Issues

Need to upgrade project to Python 3, or documentation for the requirements

Dear Thaines,

Your project is very interesting, but unfortunately, since Python 2 and many of its packages are deprecated, it is a hassle to run it even with Python 2 on Linux. I think the quickest solution to this problem would be to write brief documentation on how to prepare the build environment. Both myself and other people interested in your project would appreciate it if you could prepare such documentation. Additionally, the right and long-term solution is to migrate the project to Python 3, because as time passes more and more packages are deprecated, making it harder and harder to prepare the build environment.

help needed -- just getting started

Hello. I would very much like to try out "My text in your handwriting." I am following the instructions, using Ubuntu 18.04 (on WSL2 in Windows 10, but I don't think that should make a difference). I am using Python 2.7. I have scipy and numpy installed. But when I try to run one of the test Python files in the frf directory (as suggested in the instructions, to test that the first of the two compiling methods is working), I get the following error, no matter which file I run.

(venv) scott@Yogi:~/envs/helit/frf$ python test_weight.py
Traceback (most recent call last):
  File "test_weight.py", line 13, in <module>
    import frf
  File "/home/scott/envs/helit/frf/frf.py", line 23, in <module>
    from frf_c import *
ImportError: /home/scott/envs/helit/frf/frf_c.so: undefined symbol: SummaryPtr

(And among possible problems mentioned in the instructions: I already have gcc installed; and I have the python-dev version installed; however, I do not have a dev version of numpy installed since I don't really understand what that is or how to install it. I just have "regular" numpy installed, I guess.)

I sure hope someone can please help me out! (It doesn't look like there's been much activity here for a while.) Thank you.

Requirements (let dependencies)

Hello,
I'm trying to launch your helit software (the let part for now), but I have a problem with dependencies.
Could you please publish a requirements file? (Even the full output of pip freeze - including packages unused by this app - from a Python installation that is able to run the 'let' app?)

Thanks in advance

LET: Thresholding causes an exception

I tried clicking on the scissor icon labeled "Toggles if the threshold for the image is shown or not" but it gave me the following:

Calculating density model...
Traceback (most recent call last):
  File "/home/ytan/Dev/handwriting/helit/handwriting/let/let.py", line 743, in __threshold_visible
    self.run_threshold()
  File "/home/ytan/Dev/handwriting/helit/handwriting/let/let.py", line 684, in run_threshold
    _, self.density = threshold.cluster_colour(image, size=self.threshold_cluster_size, halves=self.threshold_cluster_halves)
  File "/home/ytan/Dev/handwriting/helit/handwriting/let/threshold.py", line 352, in cluster_colour
    tps.learn(dm_x * scale, dm_y)
  File "/home/ytan/Dev/handwriting/helit/handwriting/let/misc/tps.py", line 47, in learn
    assert(y!=None or (a!=None and b!=None))
ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()

Issues compiling files

Hi!
I'm trying to follow the instructions in the README file for the project and am working on getting the test file 'test_continuous_to_continuous.py' to run, as the steps suggest running test files from helit/frf. I am currently encountering the following error when trying to run setup.py to create frf_c.so:
veronica@veronica-virtual-machine:~/helit/frf$ sudo python2 setup.py install
[sudo] password for veronica:
running install
running bdist_egg
running egg_info
writing frf.egg-info/PKG-INFO
writing top-level names to frf.egg-info/top_level.txt
writing dependency_links to frf.egg-info/dependency_links.txt
reading manifest file 'frf.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
writing manifest file 'frf.egg-info/SOURCES.txt'
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_py
running build_ext
There is a workaround to now inherit optimization CFLAGS when compiling wheels.
To enable this, set APPLY_LP2002043_UBUNTU_CFLAGS_WORKAROUND in your
environment. See LP: https://launchpad.net/bugs/2002043 for further context.
APPLY_LP2002043_UBUNTU_CFLAGS_WORKAROUND not detected.
building 'frf_c' extension
x86_64-linux-gnu-gcc -fno-strict-aliasing -Wdate-time -D_FORTIFY_SOURCE=2 -g -ffile-prefix-map=/build/python2.7-RH0SVf/python2.7-2.7.18=. -flto=auto -ffat-lto-objects -flto=auto -ffat-lto-objects -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/usr/local/lib/python2.7/dist-packages/numpy/core/include -I/usr/include/python2.7 -c philox.c -o build/temp.linux-x86_64-2.7/philox.o
x86_64-linux-gnu-gcc -fno-strict-aliasing -Wdate-time -D_FORTIFY_SOURCE=2 -g -ffile-prefix-map=/build/python2.7-RH0SVf/python2.7-2.7.18=. -flto=auto -ffat-lto-objects -flto=auto -ffat-lto-objects -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/usr/local/lib/python2.7/dist-packages/numpy/core/include -I/usr/include/python2.7 -c data_matrix.c -o build/temp.linux-x86_64-2.7/data_matrix.o
x86_64-linux-gnu-gcc -fno-strict-aliasing -Wdate-time -D_FORTIFY_SOURCE=2 -g -ffile-prefix-map=/build/python2.7-RH0SVf/python2.7-2.7.18=. -flto=auto -ffat-lto-objects -flto=auto -ffat-lto-objects -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/usr/local/lib/python2.7/dist-packages/numpy/core/include -I/usr/include/python2.7 -c summary.c -o build/temp.linux-x86_64-2.7/summary.o
x86_64-linux-gnu-gcc -fno-strict-aliasing -Wdate-time -D_FORTIFY_SOURCE=2 -g -ffile-prefix-map=/build/python2.7-RH0SVf/python2.7-2.7.18=. -flto=auto -ffat-lto-objects -flto=auto -ffat-lto-objects -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/usr/local/lib/python2.7/dist-packages/numpy/core/include -I/usr/include/python2.7 -c information.c -o build/temp.linux-x86_64-2.7/information.o
x86_64-linux-gnu-gcc -fno-strict-aliasing -Wdate-time -D_FORTIFY_SOURCE=2 -g -ffile-prefix-map=/build/python2.7-RH0SVf/python2.7-2.7.18=. -flto=auto -ffat-lto-objects -flto=auto -ffat-lto-objects -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/usr/local/lib/python2.7/dist-packages/numpy/core/include -I/usr/include/python2.7 -c learner.c -o build/temp.linux-x86_64-2.7/learner.o
x86_64-linux-gnu-gcc -fno-strict-aliasing -Wdate-time -D_FORTIFY_SOURCE=2 -g -ffile-prefix-map=/build/python2.7-RH0SVf/python2.7-2.7.18=. -flto=auto -ffat-lto-objects -flto=auto -ffat-lto-objects -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/usr/local/lib/python2.7/dist-packages/numpy/core/include -I/usr/include/python2.7 -c index_set.c -o build/temp.linux-x86_64-2.7/index_set.o
x86_64-linux-gnu-gcc -fno-strict-aliasing -Wdate-time -D_FORTIFY_SOURCE=2 -g -ffile-prefix-map=/build/python2.7-RH0SVf/python2.7-2.7.18=. -flto=auto -ffat-lto-objects -flto=auto -ffat-lto-objects -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/usr/local/lib/python2.7/dist-packages/numpy/core/include -I/usr/include/python2.7 -c tree.c -o build/temp.linux-x86_64-2.7/tree.o
x86_64-linux-gnu-gcc -fno-strict-aliasing -Wdate-time -D_FORTIFY_SOURCE=2 -g -ffile-prefix-map=/build/python2.7-RH0SVf/python2.7-2.7.18=. -flto=auto -ffat-lto-objects -flto=auto -ffat-lto-objects -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/usr/local/lib/python2.7/dist-packages/numpy/core/include -I/usr/include/python2.7 -c frf_c.c -o build/temp.linux-x86_64-2.7/frf_c.o
x86_64-linux-gnu-gcc -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -flto=auto -ffat-lto-objects -flto=auto -Wl,-z,relro -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -Wdate-time -D_FORTIFY_SOURCE=2 -g -ffile-prefix-map=/build/python2.7-RH0SVf/python2.7-2.7.18=. -flto=auto -ffat-lto-objects -flto=auto -ffat-lto-objects -fstack-protector-strong -Wformat -Werror=format-security -Wl,-Bsymbolic-functions -flto=auto -ffat-lto-objects -flto=auto -Wl,-z,relro -Wdate-time -D_FORTIFY_SOURCE=2 -g -ffile-prefix-map=/build/python2.7-RH0SVf/python2.7-2.7.18=. -flto=auto -ffat-lto-objects -flto=auto -ffat-lto-objects -fstack-protector-strong -Wformat -Werror=format-security -fPIC build/temp.linux-x86_64-2.7/philox.o build/temp.linux-x86_64-2.7/data_matrix.o build/temp.linux-x86_64-2.7/summary.o build/temp.linux-x86_64-2.7/information.o build/temp.linux-x86_64-2.7/learner.o build/temp.linux-x86_64-2.7/index_set.o build/temp.linux-x86_64-2.7/tree.o build/temp.linux-x86_64-2.7/frf_c.o -o build/lib.linux-x86_64-2.7/frf_c.so
/usr/bin/ld: build/temp.linux-x86_64-2.7/learner.o (symbol from plugin): in function `Idiot_new': (.text+0x0): multiple definition of `BiGaussianInfo'; build/temp.linux-x86_64-2.7/information.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/learner.o (symbol from plugin): in function `Idiot_new': (.text+0x0): multiple definition of `GaussianInfo'; build/temp.linux-x86_64-2.7/information.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/learner.o (symbol from plugin): in function `Idiot_new': (.text+0x0): multiple definition of `CategoricalInfo'; build/temp.linux-x86_64-2.7/information.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/learner.o (symbol from plugin): in function `Idiot_new': (.text+0x0): multiple definition of `NothingInfo'; build/temp.linux-x86_64-2.7/information.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/index_set.o (symbol from plugin): in function `IndexSet_new': (.text+0x0): multiple definition of `OneCatLearner'; build/temp.linux-x86_64-2.7/learner.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/index_set.o (symbol from plugin): in function `IndexSet_new': (.text+0x0): multiple definition of `SplitLearner'; build/temp.linux-x86_64-2.7/learner.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/index_set.o (symbol from plugin): in function `IndexSet_new': (.text+0x0): multiple definition of `IdiotLearner'; build/temp.linux-x86_64-2.7/learner.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/tree.o (symbol from plugin): in function `PtrArray_new': (.text+0x0): multiple definition of `BiGaussianSummary'; build/temp.linux-x86_64-2.7/summary.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/tree.o (symbol from plugin): in function `PtrArray_new': (.text+0x0): multiple definition of `GaussianSummary'; build/temp.linux-x86_64-2.7/summary.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/tree.o (symbol from plugin): in function `PtrArray_new': (.text+0x0): multiple definition of `CategoricalSummary'; build/temp.linux-x86_64-2.7/summary.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/tree.o (symbol from plugin): in function `PtrArray_new': (.text+0x0): multiple definition of `NothingSummary'; build/temp.linux-x86_64-2.7/summary.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/tree.o (symbol from plugin): in function `PtrArray_new': (.text+0x0): multiple definition of `BiGaussianInfo'; build/temp.linux-x86_64-2.7/information.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/tree.o (symbol from plugin): in function `PtrArray_new': (.text+0x0): multiple definition of `GaussianInfo'; build/temp.linux-x86_64-2.7/information.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/tree.o (symbol from plugin): in function `PtrArray_new': (.text+0x0): multiple definition of `CategoricalInfo'; build/temp.linux-x86_64-2.7/information.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/tree.o (symbol from plugin): in function `PtrArray_new': (.text+0x0): multiple definition of `NothingInfo'; build/temp.linux-x86_64-2.7/information.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/tree.o (symbol from plugin): in function `PtrArray_new': (.text+0x0): multiple definition of `OneCatLearner'; build/temp.linux-x86_64-2.7/learner.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/tree.o (symbol from plugin): in function `PtrArray_new': (.text+0x0): multiple definition of `SplitLearner'; build/temp.linux-x86_64-2.7/learner.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/tree.o (symbol from plugin): in function `PtrArray_new': (.text+0x0): multiple definition of `IdiotLearner'; build/temp.linux-x86_64-2.7/learner.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/frf_c.o (symbol from plugin): in function `CallbackReport': (.text+0x0): multiple definition of `OneCatLearner'; build/temp.linux-x86_64-2.7/learner.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/frf_c.o (symbol from plugin): in function `CallbackReport': (.text+0x0): multiple definition of `SplitLearner'; build/temp.linux-x86_64-2.7/learner.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/frf_c.o (symbol from plugin): in function `CallbackReport': (.text+0x0): multiple definition of `IdiotLearner'; build/temp.linux-x86_64-2.7/learner.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/frf_c.o (symbol from plugin): in function `CallbackReport': (.text+0x0): multiple definition of `BiGaussianInfo'; build/temp.linux-x86_64-2.7/information.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/frf_c.o (symbol from plugin): in function `CallbackReport': (.text+0x0): multiple definition of `GaussianInfo'; build/temp.linux-x86_64-2.7/information.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/frf_c.o (symbol from plugin): in function `CallbackReport': (.text+0x0): multiple definition of `CategoricalInfo'; build/temp.linux-x86_64-2.7/information.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/frf_c.o (symbol from plugin): in function `CallbackReport': (.text+0x0): multiple definition of `NothingInfo'; build/temp.linux-x86_64-2.7/information.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/frf_c.o (symbol from plugin): in function `CallbackReport': (.text+0x0): multiple definition of `BiGaussianSummary'; build/temp.linux-x86_64-2.7/summary.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/frf_c.o (symbol from plugin): in function `CallbackReport': (.text+0x0): multiple definition of `GaussianSummary'; build/temp.linux-x86_64-2.7/summary.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/frf_c.o (symbol from plugin): in function `CallbackReport': (.text+0x0): multiple definition of `CategoricalSummary'; build/temp.linux-x86_64-2.7/summary.o (symbol from plugin):(.text+0x0): first defined here
/usr/bin/ld: build/temp.linux-x86_64-2.7/frf_c.o (symbol from plugin): in function `CallbackReport': (.text+0x0): multiple definition of `NothingSummary'; build/temp.linux-x86_64-2.7/summary.o (symbol from plugin):(.text+0x0): first defined here
collect2: error: ld returned 1 exit status
error: command 'x86_64-linux-gnu-gcc' failed with exit status 1

Do other header/source files in different directories need to be compiled to resolve this issue? What can I do to run the program?
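
One plausible cause - an assumption on my part, not something confirmed in this thread: GCC 10 switched its default from -fcommon to -fno-common, so global variables defined (rather than declared extern) in a shared header now produce multiple definition errors at link time, which matches the symbol names above. If that is what is happening, forcing the old behaviour may get the build through:

CFLAGS="-fcommon" python2 setup.py install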

about the import path

I have just begun to learn Python. When I run python main.py in helit-master/handwriting/let I get:
ImportError: No module named line_graph.utils_gui.viewer
I don't quite understand how a file like 'utils_gui' that contains only '../../utils_gui/' works - how can Python find the path?
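
One workaround, per the README's suggestion of adding the root directory to Python's path (the checkout path below is illustrative):

import sys, os

# Make the repository root importable so 'utils_gui' and friends resolve
# even where the symbolic links don't work - adjust to your checkout.
sys.path.insert(0, os.path.expanduser('~/helit-master'))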

How to build C modules while using virtualenv

I get the following error because the C code doesn't find the numpy headers.
How do I tell it to look for them inside my virtualenv folder, env?

python test_discrete_to_discrete.py 
b
running build_ext
building 'frf_c' extension
creating /tmp/tmpD18M5k/home
creating /tmp/tmpD18M5k/home/daniel
creating /tmp/tmpD18M5k/home/daniel/helit
creating /tmp/tmpD18M5k/home/daniel/helit/frf
x86_64-linux-gnu-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fno-strict-aliasing -Wdate-time -D_FORTIFY_SOURCE=2 -g -fdebug-prefix-map=/build/python2.7-ZZaKJ6/python2.7-2.7.13=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/usr/include/python2.7 -c /home/daniel/helit/frf/philox.c -o /tmp/tmpD18M5k/home/daniel/helit/frf/philox.o
x86_64-linux-gnu-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fno-strict-aliasing -Wdate-time -D_FORTIFY_SOURCE=2 -g -fdebug-prefix-map=/build/python2.7-ZZaKJ6/python2.7-2.7.13=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/usr/include/python2.7 -c /home/daniel/helit/frf/data_matrix.c -o /tmp/tmpD18M5k/home/daniel/helit/frf/data_matrix.o
In file included from /home/daniel/helit/frf/data_matrix.c:11:0:
/home/daniel/helit/frf/data_matrix.h:20:31: fatal error: numpy/arrayobject.h: No such file or directory
 #include <numpy/arrayobject.h>
                               ^
compilation terminated.
Traceback (most recent call last):
  File "test_discrete_to_discrete.py", line 13, in <module>
    import frf
  File "/home/daniel/helit/frf/frf.py", line 23, in <module>
    from frf_c import *
ImportError: No module named frf_c
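
For what it's worth, numpy can report where its headers live - numpy.get_include() is a standard numpy function, though wiring its output into this module's compile step is left as an exercise:

import numpy

# Prints the include directory that contains numpy/arrayobject.h; inside a
# virtualenv it sits under the env's site-packages, e.g.
# env/lib/python2.7/site-packages/numpy/core/include
print(numpy.get_include())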

The result model varies for each run

Hi,

   I have little knowledge of DPs. However, I used the code to estimate the density of some mixtures and tried to get the mixture components. It seems that the algorithm produces different results each time; the self.v values differ on each run.

   I am not sure whether this is a bug or simply the nature of the variational inference algorithm.


Original issue reported on code.google.com by [email protected] on 17 Sep 2013 at 11:23

ValueError - Numpy?

Hey Tom,
first of all, thank you for your work and research - it's very impressive!
I listened to your interview on the German radio station 'DLF' and started to learn some basics of Python (so I'm fairly new, but it's a lot of fun to learn this stuff). After some - or more - troubleshooting I got your handwriting code to run... nearly.
The GUI works for both (hst and let), but if I try to load an image, generate, or calculate stuff I get this error:

Loading image (image shows up in the let-gui) [let]
loading image

Toggle threshold [let]
toggle

Generate with an already tagged image [hst]
generate

I already tried older versions of numpy, as you commented that some numpy features were deprecated - but that didn't fix my problems... maybe it's something simple you or somebody else can help me with, to finally get it to run properly :)

Thank you in advance for your help!

Installing?

How do I install and use My Text in Your Handwriting on my Windows or Mac machine? I know this uses Python, but can you guys guide me?

import error : cannot import name weave

I tried to run make_doc.py in dhdp with PyCharm, and PyCharm's console reported: ImportError: cannot import name weave
I tried making a new Python file which only contains the code:

from solve_weave import gibbs_all, gibbs_doc

It still shows the same error. Please help me find out why. Thanks.

Windows?

Hello
I want to use this for an essay my stupid teacher wants to be handwritten, but I use Windows.
Is there another version of the software you might have developed recently, or is there any alternative?
I think this could be huge among students, but it seems like you guys discontinued it or something.
It would be great if you give me a solution for this!

TypeError: Couldn't find foreign struct converter for 'cairo.Context'

When I run the utils_gui/image_viewer.py script, this is what I get:

python utils_gui/image_viewer.py 
/home/daniel/helit/utils_gui/viewer.py:16: PyGIWarning: Gtk was imported without specifying a version first. Use gi.require_version('Gtk', '3.0') before import to ensure that the right version gets loaded.
  from gi.repository import Gtk, Gdk, GdkPixbuf
TypeError: Couldn't find foreign struct converter for 'cairo.Context'
TypeError: Couldn't find foreign struct converter for 'cairo.Context'
TypeError: Couldn't find foreign struct converter for 'cairo.Context'
TypeError: Couldn't find foreign struct converter for 'cairo.Context'

And it opens a blank image.

I'm running the code on ubuntu 17.04 x64

Would appreciate any type of help.

cost_proxy error on Handwriting synthesis generate

Hey there,

This looks like a really interesting project. I've managed to get it compiled on Mac OSX, the line graph viewer works well, the HST main functions perfectly, but when I hit generate, I get the following error:

Starting generation...
L1::Selecting Glyphs...
Traceback (most recent call last):
  File "/Users/lukeandkimberlee/helit/handwriting/hst/hst.py", line 613, in __generate
    glyph_list = select_glyphs_dp(txt[i], self.glyph_db, fetch_count, match_strength, wrong_place_cost, space_mult, cfunc, True)
  File "/Users/lukeandkimberlee/helit/handwriting/hst/generate.py", line 133, in select_glyphs_dp
    cost[j,i] = mult * cost_func(left, right)
  File "/Users/lukeandkimberlee/helit/handwriting/hst/costs.py", line 93, in end_dist_cost_rf
    cost_proxy = frf.load_forest('cost_proxy.rf')
  File "/Users/lukeandkimberlee/helit/handwriting/hst/frf/frf.py", line 43, in load_forest
    f = bz2.BZ2File(fn, 'r')
IOError: [Errno 2] No such file or directory: 'cost_proxy.rf'

There is no cost_proxy.rf in the 'hst' directory. Do I have to run one of the python files to generate it first, or am I missing something here?

BTW I'm using the twin_older fineliner line graph to test with if that matters.

possible bug in wishart sampler

I randomly came across your code as I was trying to confirm the correctness of 
my own Wishart sampler, and I think you might have a bug. Instead of:

  random.gammavariate(0.5*(self.dof-d+1),2.0)

I think it should be:

  random.gammavariate(0.5*(self.dof-r),2.0)

Anyway, sorry to bother you, and definitely let me know if I'm wrong. :) Thanks.

Original issue reported on code.google.com by [email protected] on 2 May 2013 at 7:05
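
For reference, a minimal Bartlett-decomposition Wishart sampler consistent with the suggested correction - with 0-indexed rows the i-th diagonal uses dof - i degrees of freedom. This is a sketch written independently of the repository's code:

import numpy as np

def sample_wishart(dof, scale, rng=None):
    # Bartlett decomposition: W = (L A)(L A)^T, with L the Cholesky factor
    # of the scale matrix and A lower triangular with standard normal
    # off-diagonals and A[i,i]^2 ~ chi-squared(dof - i) for 0-indexed i.
    rng = np.random.default_rng() if rng is None else rng
    d = scale.shape[0]
    L = np.linalg.cholesky(scale)
    A = np.tril(rng.standard_normal((d, d)), k=-1)
    A[np.diag_indices(d)] = np.sqrt(rng.chisquare(dof - np.arange(d)))
    LA = L @ A
    return LA @ LA.T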

License clarification requested

Hi, in the README for handwriting you say that you use "GNU Affero GPL 3.0 for the bits we want to license (research use is OK, but commercial use would be extremely problematic)". However, the Gnu Affero GPL 3 license doesn't place any restrictions on commercial use (as long as modified source code is offered).

I'm asking because I work for a small startup that's contemplating adding handwritten notes to our greeting card/sharing service we're launching. We came across Handwriting.io and it looked neat but pricy, so we started hunting around for other possibilities and found this.

I don't want to step on any toes, but it's not clear if you're actually trying to prohibit commercial use of the handwriting library or not.
