
x-stream's People

Contributors

bindscha, junyaozhao


x-stream's Issues

Error during build due to hdfs.h

Hello,

I'm trying to build x-stream. I followed your instructions in INSTALL.md.
It looks like I'm missing a library. I've installed Hadoop (http://hadoop.apache.org/) and set HADOOP_HOME.
Do you have any advice on how I can fix my problem?

Thanks in advance

$make
g++ -O3 -g -Wall -Wno-unused-function -Wfatal-errors -DCOMPACT_GRAPH -DZLIB_COMPRESSION_LEVEL=Z_BEST_SPEED -msse4.2 -include core/types.h -I/usr/local/include/boost-numeric-bindings -IADOOP_HOME/include -c -o object_files/core.o core/core.cpp
In file included from core/../utils/memory_utils.h:34:
from core/autotuner.hpp:22,
from core/split_stream.hpp:21,
from core/disk_io.hpp:21,
from core/x-lib.hpp:21,
from core/core.cpp:19:
core/../utils/../core/hdfs_io.h:12:23: fatal error: hdfs/hdfs.h: No such file or directory
#include <hdfs/hdfs.h>
^
compilation terminated.
make: *** [object_files/core.o] Error 1

make error

Hello,

I'm trying to build x-stream. I followed your instructions in INSTALL.md.
It looks like I'm missing a library. I've installed Hadoop and set HADOOP_HOME.
Do you have any advice on how I can fix my problem?
Thanks in advance

$make
g++ -O3 -g -Wall -Wno-unused-function -Wfatal-errors -DCOMPACT_GRAPH -DZLIB_COMPRESSION_LEVEL=Z_BEST_SPEED -msse4.2 -include core/types.h -I/usr/local/include/boost-numeric-bindings -IADOOP_HOME/include -c -o object_files/core.o core/core.cpp
In file included from core/../utils/memory_utils.h:34:
from core/autotuner.hpp:22,
from core/split_stream.hpp:21,
from core/disk_io.hpp:21,
from core/x-lib.hpp:21,
from core/core.cpp:19:
core/../utils/../core/hdfs_io.h:12:23: fatal error: hdfs/hdfs.h: No such file or directory
#include <hdfs/hdfs.h>
^
compilation terminated.
make: *** [object_files/core.o] Error 1

Conductance.cpp: base_edges should be the set with fewer edges

if(conductance_per_processor_data::edges_from_red >
	   conductance_per_processor_data::edges_from_black) {
	  base_edges = conductance_per_processor_data::edges_from_red;
	}
	else {
	  base_edges = conductance_per_processor_data::edges_from_black;
	}
		  
	weight_t conductance_value = 
	  ((weight_t)conductance_per_processor_data::crossover_edges)/
	  base_edges;
	BOOST_LOG_TRIVIAL(info) << "ALGORITHM::CONDUCTANCE::VALUE "
				<< conductance_value;

The base edge count taken in the code is the set with the most edges in it; however, if I am not mistaken, it should be the set with the fewest edges.
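
For reference, here is a minimal sketch of the suggested fix, assuming the intent is to normalise by the smaller of the two edge sets, i.e. conductance = crossover_edges / min(edges_from_red, edges_from_black):

// Sketch of the suggested change: take the smaller edge set as the base,
// so the reported value is crossover_edges divided by the minimum of the two.
if(conductance_per_processor_data::edges_from_red <
   conductance_per_processor_data::edges_from_black) {
  base_edges = conductance_per_processor_data::edges_from_red;
}
else {
  base_edges = conductance_per_processor_data::edges_from_black;
}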

Unable to open stream file:No such file or directory

I followed the steps described in the README.md. But when I run the program with different datasets, there is an error: "Unable to open stream file:No such file or directory". The files named "test" and "test.ini" both exist under the "generators" folder.
The details are:

[root@node01 bin]# ./benchmark_driver -g ../generators/test -b pagerank --pagerank::niters 10 -a -p 16 --physical_memory 268435456
<INFO 7f0e7a877700> STARTUP
<INFO 7f0e7a877700> CORE::GRAPH ../generators/test
<INFO 7f0e7a877700> CORE::ALGORITHM pagerank
<INFO 7f0e7a877700> SG-DRIVER-ORIGINAL
<WARNING 7f0e7a877700> Disk count not specified, assuming it to be 1
<INFO 7f0e7a877700> CORE::CONFIG::PROCESSORS 16
<INFO 7f0e7a877700> CORE::CONFIG::PHYSICAL_MEMORY 268435456
<INFO 7f0e7a877700> CORE::CONFIG::VERTICES 1048576
<INFO 7f0e7a877700> CORE::CONFIG::VERTEX_SIZE 12
<INFO 7f0e7a877700> CORE::CONFIG::VERTEX_BUFFER 12582912
<INFO 7f0e7a877700> CORE::CONFIG::EDGES 16777216
<INFO 7f0e7a877700> CORE::CONFIG::PARTITIONS 16
<INFO 7f0e7a877700> CORE::CONFIG::FANOUT 16
<INFO 7f0e7a877700> CORE::CONFIG::SUPER_PARTITIONS 1
<INFO 7f0e7a877700> CORE::CONFIG::STREAM_UNIT 16777216
<INFO 7f0e7a877700> CORE::CONFIG::BUFFER_SIZE 47810235
<INFO 7f0e7a877700> CORE::CONFIG::IO_NUMIOQS 1
<INFO 7f0e7a877700> CORE::CONFIG::MAX_BUFFERS 4
<INFO 7f0e7a877700> CORE::CONFIG::MAX_STREAMS 5
<WARNING 7f0e7a877700> No mount point specified for disk0, falling back to current directory
<FATAL 7f0e7a877700> Unable to open stream file:No such file or directory

Does HDFS support require Hadoop?

When building the environment, I encountered the following problem.

g++ -O3 -g -Wall -Wno-unused-function -Wfatal-errors -DCOMPACT_GRAPH -DZLIB_COMPRESSION_LEVEL=Z_BEST_SPEED -msse4.2 -include core/types.h -I/usr/local/include/boost-numeric-bindings -IADOOP_HOME/include -c -o object_files/core.o core/core.cpp
In file included from core/../utils/memory_utils.h:34:0,
from core/autotuner.hpp:22,
from core/split_stream.hpp:21,
from core/disk_io.hpp:21,
from core/x-lib.hpp:21,
from core/core.cpp:19:
core/../utils/../core/hdfs_io.h:12:23: fatal error: hdfs/hdfs.h: No such file or directory
#include <hdfs/hdfs.h>
^
compilation terminated.
make: *** [object_files/core.o] Error 1

Do I need to have Hadoop installed to get HDFS support?

MCST application does not seem to be running correctly?

Hi,
I use x-stream to run the MCST algorithm. I use only a tiny benchmark, as follows:
[screenshot of the benchmark graph]
I converted it to binary in x-stream's default input format.
The command line to run it is:
[screenshot of the command line]
But the program enters an infinite loop. (Other algorithms run fine.)
[screenshot of the program output]
Why does this happen? Thank you!

segmentation fault

Hello, there is a problem with the input data file type when I try to use your code. This is the link to the file I used: http://snap.stanford.edu/data/wiki-Vote.html . This input file is an edge list, and I want to convert it to a binary edge list using the tool you provided. Unfortunately, some details of the input file expected by your convert program differ slightly from mine, so I had to make a small change to the convert script, shown below:

for line in infile:
    if line[0] == '#':
        pass
    else:
        vector = line.strip().split("\t")
        vector = list(map(int, vector))
        vector.append(random.random())  # Edge weight
        if vector[0] > vertices:        # I removed the "- 1" here
            vertices = vector[0]
        if vector[1] > vertices:
            vertices = vector[1]
        edges = edges + 1
        outfile.write(s.pack(*vector))
        if add_rev_edges:
            edges = edges + 1
            tmp = vector[0]
            vector[0] = vector[1]
            vector[1] = tmp
            outfile.write(s.pack(*vector))

But when I used the converted file to run the framework, it showed a segmentation fault, as below:
[screenshot of the segmentation fault]
Do you know what might be causing this problem? And is there a specification for the required binary edge list format? When I try to input some other binary edge list file, it says it is unable to read the property file.
[screenshot of the "unable to read property file" error]
Thank you very much!
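
For what it is worth, here is a minimal sketch of the record layout implied by the conversion script above, assuming s packs two 32-bit integers and one 32-bit float (e.g. struct.Struct("IIf") — an assumption, not confirmed against the actual script):

// Assumed on-disk layout of one edge record, matching s.pack(src, dst, weight)
// under a hypothetical "IIf" format; field widths differ if s uses another format.
#include <cstdint>
#pragma pack(push, 1)
struct edge_record {
  uint32_t src;    // source vertex id
  uint32_t dst;    // destination vertex id
  float    weight; // random edge weight appended by the script
};
#pragma pack(pop)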

core/../utils/boost_log_wrapper.h:64: error: ‘BOOST_ASSERT_MSG’ was not declared in this scope

When I try to make install, there is an error. I have installed LAPACK and followed INSTALL.md:

g++ -O3 -g -Wall -Wno-unused-function -Wfatal-errors -DCOMPACT_GRAPH -DZLIB_COMPRESSION_LEVEL=Z_BEST_SPEED -msse4.2 -include core/types.h -I/usr/local/include/boost-numeric-bindings -IADOOP_HOME/include -c -o object_files/core.o core/core.cpp
In file included from core/../utils/memory_utils.h:34,
from core/autotuner.hpp:22,
from core/split_stream.hpp:21,
from core/disk_io.hpp:21,
from core/x-lib.hpp:21,
from core/core.cpp:19:
core/../utils/../core/hdfs_io.h:12:23: error: hdfs/hdfs.h: No such file or directory
In file included from core/autotuner.hpp:21,
from core/split_stream.hpp:21,
from core/disk_io.hpp:21,
from core/x-lib.hpp:21,
from core/core.cpp:19:
core/../utils/boost_log_wrapper.h: In destructor ‘BOOST_LOG_TRIVIAL::~BOOST_LOG_TRIVIAL()’:
core/../utils/boost_log_wrapper.h:64: error: ‘BOOST_ASSERT_MSG’ was not declared in this scope
compilation terminated due to -Wfatal-errors.
make: *** [object_files/core.o] Error 1

Installation help

Ubuntu 18.04
Hadoop 3.1.1

I am getting the following error

g++ -O3 -g -Wall -Wno-unused-function -Wfatal-errors -DCOMPACT_GRAPH -DZLIB_COMPRESSION_LEVEL=Z_BEST_SPEED -msse4.2 -include core/types.h -I/usr/local/include/boost-numeric-bindings -IADOOP_HOME/include -c -o object_files/core.o core/core.cpp
In file included from core/../utils/memory_utils.h:34:0,
                 from core/autotuner.hpp:22,
                 from core/split_stream.hpp:21,
                 from core/disk_io.hpp:21,
                 from core/x-lib.hpp:21,
                 from core/core.cpp:19:
core/../utils/../core/hdfs_io.h: In member function ‘void x_lib::hdfs_io::refreshReadHandle(int)’:
core/../utils/../core/hdfs_io.h:124:7: warning: unused variable ‘closeretval’ [-Wunused-variable]
   int closeretval = hdfsCloseFile(fs, fileRDmap[fd]);
       ^~~~~~~~~~~
In file included from core/core.cpp:19:0:
core/x-lib.hpp: In constructor ‘x_lib::streamIO<A>::streamIO()’:
core/x-lib.hpp:477:72: warning: ‘new’ of type ‘x_lib::x_thread’ with extended alignment 64 [-Waligned-new=]
                 workers[i] = new x_thread(config, i, cpu_state_array[i]);
                                                                        ^
core/x-lib.hpp:477:72: note: uses ‘void* operator new(std::size_t)’, which does not have an alignment parameter
core/x-lib.hpp:477:72: note: use ‘-faligned-new’ to enable C++17 over-aligned new support
In file included from core/core.cpp:20:0:
core/sg_driver.hpp: In constructor ‘algorithm::scatter_gather<A, F>::scatter_gather()’:
core/sg_driver.hpp:173:35: warning: ‘new’ of type ‘algorithm::sg_pcpu’ with extended alignment 64 [-Waligned-new=]
       pcpu_array[i] = new sg_pcpu();
                                   ^
core/sg_driver.hpp:173:35: note: uses ‘void* operator new(std::size_t)’, which does not have an alignment parameter
core/sg_driver.hpp:173:35: note: use ‘-faligned-new’ to enable C++17 over-aligned new support
In file included from core/core.cpp:21:0:
core/sg_driver_async.hpp: In constructor ‘algorithm::async_scatter_gather<A, F>::async_scatter_gather()’:
core/sg_driver_async.hpp:153:41: warning: ‘new’ of type ‘algorithm::sg_async_pcpu’ with extended alignment 64 [-Waligned-new=]
       pcpu_array[i] = new sg_async_pcpu();
                                         ^
core/sg_driver_async.hpp:153:41: note: uses ‘void* operator new(std::size_t)’, which does not have an alignment parameter
core/sg_driver_async.hpp:153:41: note: use ‘-faligned-new’ to enable C++17 over-aligned new support
Assembler messages:
Fatal error: can't create object_files/core.o: No such file or directory
Makefile:61: recipe for target 'object_files/core.o' failed
make: *** [object_files/core.o] Error 1

fatal error: hdfs/hdfs.h: No such file or directory

When I try to make, there is an error. Could you please provide a way to install the library?

g++ -O3 -g -Wall -Wno-unused-function -Wfatal-errors -DCOMPACT_GRAPH -DZLIB_COMPRESSION_LEVEL=Z_BEST_SPEED -msse4.2 -include core/types.h -I/usr/include/boost-numeric-bindings -IADOOP_HOME/include -c -o object_files/core.o core/core.cpp
In file included from core/../utils/memory_utils.h:34:0,
from core/autotuner.hpp:22,
from core/split_stream.hpp:21,
from core/disk_io.hpp:21,
from core/x-lib.hpp:21,
from core/core.cpp:19:
core/../utils/../core/hdfs_io.h:12:23: fatal error: hdfs/hdfs.h: No such file or directory
#include <hdfs/hdfs.h>
^
compilation terminated.
make: *** [object_files/core.o] Error 1
