
t2vec's People

Contributors

boathit, striderdu


t2vec's Issues

Significance of index offset in saveKNearestVocabs

Hello,

I have been looking through the code and porting parts of it to Python. In saveKNearestVocabs there is an offset in the for loops over the vocab. At first I thought it was just due to the difference between Julia being 1-indexed and Python being 0-indexed, but now I am not sure:

function saveKNearestVocabs(region::SpatialRegion, datapath::String)
    V = zeros(Int, region.k, region.vocab_size)
    D = zeros(Float64, region.k, region.vocab_size)
    for vocab in 0:region.vocab_start-1
        V[:, vocab+1] .= vocab
        D[:, vocab+1] .= 0.0
    end
    for vocab in region.vocab_start:region.vocab_size-1
        cell = region.vocab2hotcell[vocab]
        kcells, dists = knearestHotcells(region, cell, region.k)
        kvocabs = map(x->region.hotcell2vocab[x], kcells)
        V[:, vocab+1] .= kvocabs
        D[:, vocab+1] .= dists

The resulting file just has an empty first entry in the V and D arrays, since the PAD token actually ends up at index 1, and the vocab, which starts at 4, is now at index 5. Is there a downstream motivation for this, or is it just how it was first implemented?
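For reference, a minimal sketch of how the same two fill loops might look in a 0-indexed Python/NumPy port (the region object mirrors the Julia SpatialRegion fields, and knearest_hotcells is a hypothetical helper standing in for knearestHotcells; the file writing is omitted):

import numpy as np

def save_k_nearest_vocabs(region):
    """0-indexed port of the two fill loops shown above (file writing omitted).

    In Julia, V[:, vocab+1] maps vocab id 0 to column 1 because Julia arrays
    are 1-indexed; with 0-indexed NumPy arrays the same column is V[:, vocab],
    so the +1 disappears and no empty leading column is created.
    """
    V = np.zeros((region.k, region.vocab_size), dtype=np.int64)
    D = np.zeros((region.k, region.vocab_size), dtype=np.float64)
    # special tokens (e.g. PAD) with ids below vocab_start map to themselves
    for vocab in range(region.vocab_start):
        V[:, vocab] = vocab
        D[:, vocab] = 0.0
    # real cells: the k nearest hot cells and their distances
    for vocab in range(region.vocab_start, region.vocab_size):
        cell = region.vocab2hotcell[vocab]
        kcells, dists = knearest_hotcells(region, cell, region.k)  # hypothetical helper
        V[:, vocab] = [region.hotcell2vocab[c] for c in kcells]
        D[:, vocab] = dists
    return V, D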

Problems of preprocess.jl

When I run julia preprocess.jl, I get:

Building spatial region with:
        cityname=porto,
        minlon=-8.735152,
        minlat=40.953673,
        maxlon=-8.156309,
        maxlat=41.307945,
        xstep=100.0,
        ystep=100.0,
        minfreq=100
Creating paramter file /home/nchen/Projects/t2vec/data/porto-param-cell100
ERROR: LoadError: BoundsError: attempt to access 0-element Array{Pair{Int64,Int64},1} at index [0]
Stacktrace:
 [1] getindex(::Array{Pair{Int64,Int64},1}, ::Int64) at ./array.jl:809
 [2] makeVocab!(::SpatialRegion, ::String) at /home/nchen/Projects/t2vec/preprocessing/SpatialRegionTools.jl:205
 [3] top-level scope at /home/nchen/Projects/t2vec/preprocessing/preprocess.jl:54
 [4] include(::Function, ::Module, ::String) at ./Base.jl:380
 [5] include(::Module, ::String) at ./Base.jl:368
 [6] exec_options(::Base.JLOptions) at ./client.jl:296
 [7] _start() at ./client.jl:506
in expression starting at /home/nchen/Projects/t2vec/preprocessing/preprocess.jl:49
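The BoundsError indexes a 0-element array of cell/count pairs inside makeVocab!, which typically means no hot cells were found, either because the HDF5 file has no (or too few) trips or because minfreq=100 is too high for a small sample. A quick check that the data file actually contains trips, assuming the /trips/<i> layout the preprocessing code reads (the file name is an assumption based on cityname=porto):

import h5py

# Hypothetical sanity check: verify that porto2h5.jl actually wrote trajectories
# before running preprocess.jl. Adjust the path to your own data directory.
with h5py.File("/home/nchen/Projects/t2vec/data/porto.h5", "r") as f:
    trips = f["trips"]                 # group with one dataset per trip
    print("number of trips:", len(trips))
    if len(trips) > 0:
        print("first trip shape:", trips["1"][...].shape)   # trips are 1-indexed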

Usage with Custom dataset

Hi,
Nice work! I am planning to use this algorithm with a custom dataset of mine. I have the timestamp, latitude, longitude, and userID, and I want to convert each trajectory to a vector using this algorithm. Can you please guide me on how I can use this code with the dataset described above?
Thanks
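For anyone in a similar position, a minimal sketch of packing such a table into the /trips/<i> HDF5 layout that the preprocessing scripts read (the column names, file names, and the exact per-trip array layout are assumptions to verify against porto2h5.jl):

import h5py
import numpy as np
import pandas as pd

# Hypothetical input: a CSV with columns userID, timestamp, lon, lat.
# Here each user's points are treated as one trajectory for simplicity;
# in practice you would split a user's points into separate trips first.
df = pd.read_csv("custom_gps.csv").sort_values(["userID", "timestamp"])

with h5py.File("custom.h5", "w") as f:
    i = 0
    for _, g in df.groupby("userID"):
        i += 1
        # Assumed layout: a 2 x n array of (lon, lat) per trip, mirroring what
        # the Julia preprocessing reads via f["/trips/$i"]. Note that h5py is
        # row-major while Julia/HDF5.jl is column-major, so verify with a small
        # file whether the Julia side sees shape (2, n) or (n, 2) and transpose
        # if needed. Also check porto2h5.jl for any extra metadata it writes.
        trip = np.vstack([g["lon"].to_numpy(), g["lat"].to_numpy()])
        f.create_dataset(f"/trips/{i}", data=trip)

print("wrote", i, "trips")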

data meaning

Hello, I ran your code for a bit and found the meaning of some parameters hard to understand, in particular querysize::Int and dbsize::Int.
I also used your code on a small subset of the data to test that it works, but an error was reported when generating the vectors.

Clustering task with t2vec

Hi,

I have a clustering task on the Porto dataset. I use your best_model.pt to get the trajectory representations (about 10,000 trajectories) and run k-means to obtain the cluster result.

But compared with traditional measures (like LCSS/EDR), the ACC/NMI is really bad (0.2 vs 0.6), and I am confused by the results. Can you tell me why this could happen?

Thanks a lot!

Problem about the released best_model on Porto

Hi boathit,
I plan to use your trained model on Porto. However, I have a problem: I can't reproduce your results using your code and your released best_model. The performance of your experiment 1 (querysize=1000, dbsize=100000) in terms of mean rank is 12152.9840. My Python and PyTorch versions are 3.6 and 1.5. I would like to know which version best_model was saved with, and whether this fault is caused by a version mismatch.
P.S. I follow your script exactly to do the experiment. Moreover, there are no errors when I run the code.
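One quick way to narrow things down is to load the checkpoint on CPU and inspect it before blaming the PyTorch version (a hypothetical check; the file name comes from the issue, and the key names depend on how the checkpoint was saved):

import torch

# Load the released checkpoint on CPU regardless of which device/version saved it.
checkpoint = torch.load("best_model.pt", map_location="cpu")
print(type(checkpoint))
if isinstance(checkpoint, dict):
    for key, value in checkpoint.items():
        # state_dicts are dicts of tensors; printing a few shapes makes it easy
        # to spot a vocab_size / hidden_size mismatch with your own arguments.
        if hasattr(value, "items"):
            for name, tensor in list(value.items())[:5]:
                print(" ", key, name, getattr(tensor, "shape", type(tensor)))
        else:
            print(" ", key, value)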

Different test results of EDR and LCSS

Hi, thanks for your help last time, and now I am getting to the point of testing baseline methods.
My test data was generated according to your paper (I also compared my code with yours, and it is basically the same). I also use the code from the authors of EDwP to implement the baseline methods.

The mean rank of EDwP is similar to yours, which convinces me that the test data is fine.
However, when it comes to EDR and LCSS, the results differ a lot: e.g., when the db size is 60k, the mean rank of LCSS is 2.109 and that of EDR is 0.013, which is quite different from yours.

I guess the problem might be due to the threshold setting.
I set the threshold of EDR and LCSS to a quarter of the maximum standard deviation between the two trajectories being examined, which is the suggestion from the authors of EDR.
(To make it clear: suppose the two trajectories are named traj_a and traj_b; calculate the std on the two dimensions of both traj_a and traj_b, which yields four candidates, and choose a quarter of the maximum std as the threshold, as sketched below.)
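A minimal sketch of that threshold computation, assuming each trajectory is an (n, 2) array of coordinates (the names are hypothetical):

import numpy as np

def edr_lcss_threshold(traj_a: np.ndarray, traj_b: np.ndarray) -> float:
    """Quarter of the maximum per-dimension standard deviation of the two
    trajectories being compared, as described above."""
    # std over each of the two coordinate dimensions of each trajectory -> 4 candidates
    candidates = np.concatenate([traj_a.std(axis=0), traj_b.std(axis=0)])
    return candidates.max() / 4.0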

I noticed that in the paper you said you adopted the strategies described in LCSS and EDR.
So I am wondering about the details of your implementations of EDR and LCSS.

I would appreciate your help!
Thanks!

pytorch version problem?

  File "/home/hjhuang/weichen/t2vec/train.py", line 271, in train
    loss.backward()
  File "/home/hjhuang/anaconda3/envs/t2vec2/lib/python3.6/site-packages/torch/tensor.py", line 102, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph)
  File "/home/hjhuang/anaconda3/envs/t2vec2/lib/python3.6/site-packages/torch/autograd/__init__.py", line 90, in backward
    allow_unreachable=True)  # allow_unreachable flag
RuntimeError: CUDA error: device-side assert triggered
/opt/conda/conda-bld/pytorch_1544081127912/work/aten/src/THC/THCTensorScatterGather.cu:97: 
void THCudaTensor_gatherKernel(TensorInfo<Real, IndexType>, TensorInfo<Real, IndexType>, 
TensorInfo<long, IndexType>, int, IndexType) [with IndexType = unsigned int, Real = float, Dims = 2]: 
block: [13,0,0], thread: [385,0,0] Assertion `indexValue >= 0 && indexValue < src.sizes[dim]` failed.

I ran training after preprocessing finished successfully, and the error above occurred.
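When a device-side assert like this fires, the reported Python line (loss.backward here) is usually not where the bad index originated. A common debugging step, sketched under the assumption that you can re-run the failing batch:

import os

# Hypothetical debugging step, not a fix: make CUDA kernel launches synchronous
# so the assert surfaces at the offending call (often an embedding/gather fed a
# token id outside [0, vocab_size)) instead of at a later loss.backward().
# The variable must be set before torch initializes CUDA; running the batch on
# CPU gives a similarly readable "index out of range" error.
os.environ["CUDA_LAUNCH_BLOCKING"] = "1"

import torch  # import (and do all CUDA work) only after setting the variable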

Should i run training first?

I am a beginner in trajectory data mining. I noticed that in the README the embedding step comes after training, and I am a little confused. The embedding layer is part of the encoder-decoder model, so shouldn't I train this layer first and then use it in the entire encoder-decoder model?
I really cannot figure this out :( Can you help me?
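For what it's worth, in a typical seq2seq setup the embedding layer is just another trainable submodule of the encoder-decoder, so it is learned jointly during training rather than trained beforehand. A minimal PyTorch sketch (not the repository's exact model) illustrating this:

import torch
import torch.nn as nn

class TinyEncoder(nn.Module):
    """Toy encoder: the embedding is a submodule, so its weights are
    optimized together with the GRU during normal training."""
    def __init__(self, vocab_size: int, emb_size: int, hidden_size: int):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_size)
        self.rnn = nn.GRU(emb_size, hidden_size, batch_first=True)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        emb = self.embedding(tokens)      # (batch, seq, emb_size)
        _, h = self.rnn(emb)              # final hidden state
        return h[-1]                      # (batch, hidden_size)

model = TinyEncoder(vocab_size=1000, emb_size=32, hidden_size=64)
# One optimizer covers *all* parameters, embedding included.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

The pretrained_embedding=None option that appears in the argument dumps elsewhere on this page would only be an optional initialization of that layer, not a separate training stage.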

Generative Loss is negative

I used t2vec to train on ADS-B data from OpenSky (download link: opensky).
The data I used covers 2022-05-02 to 2022-06-27.
The hyper-parameters are:

{
  "region": {
    "cityname": "opensky",
    "minlon": 73.49901302691623,
    "minlat": 3.83788851813507,
    "maxlon": 135.08737696307114,
    "maxlat": 53.5616571938264,
    "cellsize": 100.0,
    "minfreq": 20
  }
}

args are:

Namespace(data='E:/codes/github/t2vec/data', checkpoint='E:/codes/github/t2vec/data/checkpoint.pt', prefix='opensky', pretrained_embedding=None, num_layers=3, bidirectional=True, hidden_size=256, embedding_size=256, dropout=0.2, max_grad_norm=5.0, learning_rate=0.001, batch=128, generator_batch=32, t2vec_batch=256, start_iteration=0, epochs=15, print_freq=50, save_freq=1000, cuda=True, use_discriminative=False, discriminative_w=0.1, criterion_name='KLDIV', knearestvocabs='data/opensky-vocab-dist-cell100.h5', dist_decay_speed=0.8, max_num_line=20000000, max_length=200, mode=0, vocab_size=20000, bucketsize=[(20, 30), (30, 30), (30, 50), (50, 50), (50, 70), (70, 70), (70, 100), (100, 100)])

training logs are:

Iteration: 0    Generative Loss: 2.175  Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 50   Generative Loss: 1.196  Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 100  Generative Loss: 1.089  Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 150  Generative Loss: 0.731  Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 200  Generative Loss: 0.528  Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 250  Generative Loss: 0.376  Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 300  Generative Loss: 0.278  Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 350  Generative Loss: 0.226  Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 400  Generative Loss: 0.195  Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 450  Generative Loss: 0.092  Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 500  Generative Loss: 0.077  Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 550  Generative Loss: 0.077  Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 600  Generative Loss: 0.076  Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 650  Generative Loss: 0.047  Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 700  Generative Loss: 0.052  Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 750  Generative Loss: -0.023 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 800  Generative Loss: -0.016 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 850  Generative Loss: -0.010 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 900  Generative Loss: -0.031 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 950  Generative Loss: -0.023 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 1000 Generative Loss: -0.021 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Saving the model at iteration 1000 validation loss 76.77668204471983
Iteration: 1050 Generative Loss: -0.057 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 1100 Generative Loss: -0.052 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 1150 Generative Loss: -0.050 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 1200 Generative Loss: -0.056 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 1250 Generative Loss: -0.060 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 1300 Generative Loss: -0.081 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 1350 Generative Loss: -0.056 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 1400 Generative Loss: 0.070  Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 1450 Generative Loss: -0.063 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 1500 Generative Loss: -0.068 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 1550 Generative Loss: -0.095 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 1600 Generative Loss: -0.076 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 1650 Generative Loss: -0.073 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 1700 Generative Loss: -0.002 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 1750 Generative Loss: -0.066 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 1800 Generative Loss: -0.069 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 1850 Generative Loss: -0.073 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 1900 Generative Loss: -0.069 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 1950 Generative Loss: -0.073 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 2000 Generative Loss: -0.100 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Saving the model at iteration 2000 validation loss 81.64155062971444
Iteration: 2050 Generative Loss: -0.077 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 2100 Generative Loss: -0.069 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 2150 Generative Loss: -0.072 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 2200 Generative Loss: -0.079 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 2250 Generative Loss: -0.074 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 2300 Generative Loss: -0.073 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 2350 Generative Loss: -0.076 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 2400 Generative Loss: -0.071 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 2450 Generative Loss: -0.070 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 2500 Generative Loss: -0.088 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 2550 Generative Loss: -0.104 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 2600 Generative Loss: -0.091 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 2650 Generative Loss: -0.075 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 2700 Generative Loss: -0.083 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 2750 Generative Loss: -0.063 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 2800 Generative Loss: -0.069 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 2850 Generative Loss: -0.071 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 2900 Generative Loss: -0.075 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 2950 Generative Loss: -0.072 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Iteration: 3000 Generative Loss: -0.100 Discriminative Cross Loss: 0.000        Discriminative Inner Loss: 0.000
Saving the model at iteration 3000 validation loss 85.59408001077587

As we can see, from iteration 750 onward the generative loss becomes negative and the validation loss keeps increasing.
How can this result be explained?
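One thing worth ruling out: barring numerical issues, a KL-divergence-style loss can only go negative if either the soft targets do not form a proper (normalized) probability distribution or the inputs are not valid log-probabilities. A toy check with PyTorch's F.kl_div (this is an illustration, not a claim about what the repository's KLDIV criterion computes internally):

import torch
import torch.nn.functional as F

# log-probabilities from the model side: a proper distribution over 4 "cells"
logp = torch.log_softmax(torch.zeros(1, 4), dim=-1)        # uniform, sums to 1

# Case 1: a proper target distribution -> the KL term is non-negative
target_ok = torch.tensor([[0.7, 0.1, 0.1, 0.1]])
print(F.kl_div(logp, target_ok, reduction="sum"))           # >= 0

# Case 2: target weights that do not sum to 1 -> the "KL" can be negative
target_bad = torch.tensor([[0.2, 0.1, 0.1, 0.1]])           # sums to 0.5
print(F.kl_div(logp, target_bad, reduction="sum"))           # < 0

Whether that applies here depends on how the dist-decayed weights from opensky-vocab-dist-cell100.h5 are turned into targets; if they are properly normalized and the inputs are log-probabilities, the negative values must come from somewhere else.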

Haven't found the definition of "injectnoise()"

Hi, I'm new to Julia and I noticed that injectnoise() is used without being explicitly defined anywhere.
All I found throughout the code containing "injectnoise" are two lines: L311 and L325.

function createTrainVal(region::SpatialRegion,
                        trjfile::String,
                        injectnoise::Function,
                        ntrain::Int,
                        nval::Int;
                        nsplit=5,
                        min_length=20,
                        max_length=100)
    seq2str(seq) = join(map(string, seq), " ") * "\n"
    h5open(trjfile, "r") do f
        trainsrc, traintrg = open("../data/train.src", "w"), open("../data/train.trg", "w")
        validsrc, validtrg = open("../data/val.src", "w"), open("../data/val.trg", "w")
        for i = 1:ntrain+nval
            trip = f["/trips/$i"] |> read
            min_length <= size(trip, 2) <= max_length || continue
            trg = trip2seq(region, trip) |> seq2str
            noisetrips = injectnoise(trip, nsplit)

The code works, so I must have some misunderstanding there.
Could anyone tell me where I am wrong?
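Judging from the signature quoted above, injectnoise is declared as a parameter of type Function, so its concrete implementation is whatever the caller of createTrainVal passes in (look at where createTrainVal is called in preprocess.jl). A small Python analogue of the same higher-order-function pattern, with hypothetical names, just to illustrate the mechanism:

import random
from typing import Callable, List, Sequence

def create_train_val(trips: Sequence[list],
                     injectnoise: Callable[[list, int], List[list]],
                     nsplit: int = 5) -> List[list]:
    """injectnoise is not defined here; the caller supplies it as an argument,
    exactly like the injectnoise::Function parameter in the Julia code."""
    noisy = []
    for trip in trips:
        noisy.extend(injectnoise(trip, nsplit))
    return noisy

def downsample(trip: list, nsplit: int) -> List[list]:
    """One possible noise function: randomly drop ~20% of points, nsplit times."""
    return [[p for p in trip if random.random() > 0.2] for _ in range(nsplit)]

# the 'definition' of injectnoise is chosen at the call site:
noisy_trips = create_train_val([[1, 2, 3, 4, 5]], injectnoise=downsample)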

ValueError: setting an array element with a sequence. The requested array has an inhomogeneous shape after 1 dimensions. The detected shape was (256,) + inhomogeneous part.

Hello, boathit. I tried to follow your Jupyter notebook but encountered some problems in Exp1 In[12]; I hope to get your answer. When I got to "run(t2vec)", it told me to set vocab_size to 18866 rather than 18864. After I did that, it reported the following error:
Traceback (most recent call last):
  File "/mnt/c/Users/HP/Desktop/Work/t2vec-master/t2vec.py", line 119, in <module>
    t2vec(args)
  File "/mnt/c/Users/HP/Desktop/Work/t2vec-master/evaluate.py", line 94, in t2vec
    src, lengths, invp = scaner.getbatch()
                         ^^^^^^^^^^^^^^^^^
  File "/mnt/c/Users/HP/Desktop/Work/t2vec-master/data_utils.py", line 287, in getbatch
    return pad_arrays_keep_invp(src)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mnt/c/Users/HP/Desktop/Work/t2vec-master/data_utils.py", line 84, in pad_arrays_keep_invp
    src = list(np.array(src)[idx])
               ^^^^^^^^^^^^^
ValueError: setting an array element with a sequence. The requested array has an inhomogeneous shape after 1 dimensions. The detected shape was (256,) + inhomogeneous part.
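This ValueError is what newer NumPy versions raise where older ones only emitted the VisibleDeprecationWarning quoted in a later issue ("you must specify 'dtype=object'"). A hedged workaround for the pattern shown in the traceback, demonstrated on a self-contained toy batch:

import numpy as np

# Toy stand-in for the ragged batch built in pad_arrays_keep_invp: sequences
# of different lengths, plus a permutation used for fancy indexing.
src = [np.array([1, 2, 3]), np.array([4, 5]), np.array([6, 7, 8, 9])]
idx = np.array([2, 0, 1])

# np.array(src) raises the ValueError above on NumPy >= 1.24 because the
# sequences are ragged; dtype=object keeps the old array-of-arrays behaviour,
# which is all that is needed for reordering by idx.
reordered = list(np.array(src, dtype=object)[idx])
print([a.tolist() for a in reordered])

Separately, the 18866 vs 18864 vocab_size mismatch is worth resolving against the vocab-dist file, since out-of-range token ids are what trigger the CUDA index asserts shown in another issue on this page.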

"ERROR: LoadError: syntax: incomplete: premature end of input" when running porto2h5.jl

Using Julia 1.2.0 and trying to run porto2h5.jl, I get
ERROR: LoadError: syntax: incomplete: premature end of input
with the following stack trace:
Stacktrace:
[1] top-level scope at /home/ken/t2vec_time/preprocessing/porto2h5.jl:3
[2] eval at ./boot.jl:330 [inlined]
[3] eval(::Expr) at ./client.jl:432
[4] |>(::Expr, ::typeof(eval)) at ./operators.jl:854
[5] (::getfield(Main, Symbol("##3#4")))(::HDF5File) at /home/ken/t2vec_time/preprocessing/utils.jl:33
[6] #h5open#5(::Bool, ::typeof(h5open), ::getfield(Main, Symbol("##3#4")), ::String, ::Vararg{String,N} where N) at /home/ken/.julia/packages/HDF5/rF1Fe/src/HDF5.jl:683
[7] h5open at /home/ken/.julia/packages/HDF5/rF1Fe/src/HDF5.jl:681 [inlined]
[8] porto2h5(::String) at /home/ken/t2vec_time/preprocessing/utils.jl:30
[9] top-level scope at /home/ken/t2vec_time/preprocessing/porto2h5.jl:3
[10] include at ./boot.jl:328 [inlined]
[11] include_relative(::Module, ::String) at ./loading.jl:1094
[12] include(::Module, ::String) at ./Base.jl:31
[13] include(::String) at ./client.jl:431
[14] top-level scope at REPL[2]:1
in expression starting at /home/ken/t2vec_time/preprocessing/porto2h5.jl:3

performance evaluation script

Hello, nice to meet you.
I am having trouble evaluating your project.
When the experiments/experiment.jl script from the GitHub repo is run, there are some run-time issues.
I would like a working Julia script for experiments 1), 2) and 3) of the most similar trajectory search in the "Performance evaluation" section.
I don't need the evaluation of the other methods.
Thanks for your help.

Problems for training

The environment is:
Python 3.6 and PyTorch 1.5.

When I run the code, the following problem occurs. I don't know why.

XXX$CUDA_VISIBLE_DEVICES=1 python t2vec.py -vocab_size 18864 -criterion_name "KLDIV" -knearestvocabs "porto-vocab-dist-cell100.h5"

Namespace(batch=128, bidirectional=True, bucketsize=[(20, 30), (30, 30), (30, 50), (50, 50), (50, 70), (70, 70), (70, 100), (100, 100)], checkpoint='/home/ucas/jingquanliang/SimSub/data/checkpoint.pt', criterion_name='KLDIV', cuda=True, data='/home/ucas/jingquanliang/SimSub/data', discriminative_w=0.1, dist_decay_speed=0.8, dropout=0.2, embedding_size=256, epochs=15, generator_batch=32, hidden_size=256, knearestvocabs='porto-vocab-dist-cell100.h5', learning_rate=0.001, max_grad_norm=5.0, max_length=200, max_num_line=20000000, mode=0, num_layers=3, prefix='exp', pretrained_embedding=None, print_freq=50, save_freq=1000, start_iteration=0, t2vec_batch=256, use_discriminative=False, vocab_size=18864)
Reading training data...
Read line 500000
Read line 1000000
Read line 1500000
Read line 2000000
Read line 2500000
Read line 3000000
Read line 3500000
Read line 4000000
Read line 4500000
Read line 5000000
Read line 5500000
Read line 6000000
Read line 6500000
Read line 7000000
Read line 7500000
Read line 8000000
Read line 8500000
Read line 9000000
Read line 9500000
Read line 10000000
Read line 10500000
Read line 11000000
Read line 11500000
Read line 12000000
Read line 12500000
Read line 13000000
Read line 13500000
Read line 14000000
Read line 14500000
Read line 15000000
Read line 15500000
Read line 16000000
Read line 16500000
/home/ucas/jingquanliang/SimSub/ori-t2vec/data_utils.py:155: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray
self.srcdata = list(map(np.array, self.srcdata))
/home/ucas/jingquanliang/SimSub/ori-t2vec/data_utils.py:156: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray
self.trgdata = list(map(np.array, self.trgdata))
Allocation: [5452031, 1661741, 4455989, 2563116, 1595830, 647973, 247370, 87910]
Percent: [0.32623528 0.09943424 0.26663473 0.15337016 0.0954903 0.03877301
0.01480197 0.0052603 ]
Reading validation data...
/home/ucas/jingquanliang/SimSub/ori-t2vec/data_utils.py:148: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray
self.srcdata = np.array(merge(*self.srcdata))
/home/ucas/jingquanliang/SimSub/ori-t2vec/data_utils.py:149: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray
self.trgdata = np.array(merge(*self.trgdata))
Loaded validation data size 167520
Loading vocab distance file /home/ucas/jingquanliang/SimSub/data/porto-vocab-dist-cell100.h5...
=> training with GPU
=> no checkpoint found at '/home/ucas/jingquanliang/SimSub/data/checkpoint.pt'
Iteration starts at 0 and will end at 66999
/home/ucas/jingquanliang/SimSub/ori-t2vec/data_utils.py:46: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray
src = list(np.array(src)[idx])
/home/ucas/jingquanliang/SimSub/ori-t2vec/data_utils.py:47: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray
trg = list(np.array(trg)[idx])
Iteration: 0 Generative Loss: 5.399 Discriminative Cross Loss: 0.000 Discriminative Inner Loss: 0.000
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [535,0,0], thread: [0,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [535,0,0], thread: [1,0,0] Assertion srcIndex < srcSelectDimSize failed.
(the same indexSelectLargeIndex assertion is repeated for threads [2,0,0] through [127,0,0] of block [535,0,0] and for threads [0,0,0] through [66,0,0] of block [534,0,0])
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [67,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [68,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [69,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [70,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [71,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [72,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [73,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [74,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [75,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [76,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [77,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [78,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [79,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [80,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [81,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [82,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [83,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [84,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [85,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [86,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [87,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [88,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [89,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [90,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [91,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [92,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [93,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [94,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [95,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [96,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [97,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [98,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [99,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [100,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [101,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [102,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [103,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [104,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [105,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [106,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [107,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [108,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [109,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [110,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [111,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [112,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [113,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [114,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [115,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [116,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [117,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [118,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [119,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [120,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [121,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [122,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [123,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [124,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [125,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [126,0,0] Assertion srcIndex < srcSelectDimSize failed.
/opt/conda/conda-bld/pytorch_1591914838379/work/aten/src/THC/THCTensorIndex.cu:272: indexSelectLargeIndex: block: [534,0,0], thread: [127,0,0] Assertion srcIndex < srcSelectDimSize failed.
Traceback (most recent call last):
File "t2vec.py", line 121, in
train(args)
File "/home/ucas/jingquanliang/SimSub/ori-t2vec/train.py", line 262, in train
genloss = genLoss(gendata, m0, m1, lossF, args)
File "/home/ucas/jingquanliang/SimSub/ori-t2vec/train.py", line 90, in genLoss
output = m0(input, lengths, target)
File "/home/ucas/anaconda3/envs/jqlsim/lib/python3.6/site-packages/torch/nn/modules/module.py", line 550, in call
result = self.forward(*input, **kwargs)
File "/home/ucas/jingquanliang/SimSub/ori-t2vec/models.py", line 207, in forward
encoder_hn, H = self.encoder(src, lengths)
File "/home/ucas/anaconda3/envs/jqlsim/lib/python3.6/site-packages/torch/nn/modules/module.py", line 550, in call
result = self.forward(*input, **kwargs)
File "/home/ucas/jingquanliang/SimSub/ori-t2vec/models.py", line 111, in forward
lengths = lengths.data.view(-1).tolist()
RuntimeError: CUDA error: device-side assert triggered
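
(Not part of the original report, but a hedged debugging sketch for this kind of device-side assert: it usually means an nn.Embedding lookup received a token id that is >= the embedding's num_embeddings, and re-running on CPU, or with CUDA_LAUNCH_BLOCKING=1, turns the opaque CUDA assert into a readable index error. The file path and vocab_size value below are assumptions, not values taken from the repository.)

    import torch
    import torch.nn as nn

    vocab_size = 18866                          # assumed value passed as -vocab_size
    max_token = 0
    with open("data/train.src") as f:           # hypothetical path to the token file
        for line in f:
            ids = [int(tok) for tok in line.split()]
            max_token = max(max_token, max(ids, default=0))
    print(max_token, vocab_size)                # every token id must be < vocab_size

    # Reproducing the lookup on CPU gives a readable error instead of the CUDA assert:
    emb = nn.Embedding(vocab_size, 8)
    try:
        emb(torch.tensor([vocab_size]))         # index == vocab_size is out of range
    except (IndexError, RuntimeError) as e:
        print(e)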

Training the model with varying the cell size

Hello boathit, my name is Atheir.
I'm trying to train the model with cell sizes 25, 40, and 150,

and I'm facing this error:
Traceback (most recent call last):
File "t2vec.py", line 115, in <module>
train(args)
File "/home//t2vec/train.py", line 225, in train
loss = batchloss(output, target, m1, lossF, args.generator_batch)
File "/home/jahdalay/t2vec/train.py", line 89, in batchloss
loss += lossF(o, t)
File "/home//t2vec/train.py", line 168, in <lambda>
lossF = lambda o, t: KLDIVloss(o, t, criterion, V, D)
File "/home//t2vec/train.py", line 42, in KLDIVloss
return criterion(outputk, targetk)
File "/home/*/pytorch-0.1.12/lib/python3.6/site-packages/torch/nn/modules/module.py", line 206, in __call__
result = self.forward(*input, **kwargs)
File "/home//pytorch-0.1.12/lib/python3.6/site-packages/torch/nn/modules/loss.py", line 36, in forward
return backend_fn(self.size_average, weight=self.weight)(input, target)
File "/home//pytorch-0.1.12/lib/python3.6/site-packages/torch/nn/_functions/thnn/auto.py", line 41, in forward
output, *self.additional_args)
RuntimeError: cudaEventCreateWithFlags in future ctor: device-side assert triggered

Could you please help me
Thank you very much
Best,

preprocess.jl for customized dataset

I could once run this project with the Porto dataset. In the next step, I wanted to run it on my own dataset, which is stored in a numpy array of shape (128184, 300, 3). I created a new function to make the .h5 file for my dataset:

function makeh5(npyfile::String)
    df = npzread(npyfile) # |> DataFrame
    println("Processing $(size(df, 1)) trips...")
    ## writing in numeric matrix with hdf5
    h5open("../data/geo.h5", "w") do f
        num, num_incompleted = 0, 0
        for idx in 1:size(df, 1)
            tripLength = size(df[idx, :, 2:3], 1)
            num += 1
            vect = df[idx, :, 2:3]
            m, n = size(vect)
            f["/trips/$num"] = permutedims(vect)
            f["/timestamps/$num"] = collect(0:tripLength-1) * 15.0
        end
        attributes(f)["num"] = num
        println("Saved $num trips.")
    end
end
The output of this function is:

Processing 128184 trips...
Saved 128184 trips.

Then I wanted to run the preprocess.jl file. There I only changed the following line:
createTrainVal(region, "$datapath/$cityname.h5", datapath, downsamplingDistort, 100_000, 10_000)
And of course I changed the hyperparameter file to be in line with my data. But when I run preprocess.jl I get the error below:

Building spatial region with:
cityname=geo,
minlon=22.537783,
minlat=-122.134211,
maxlon=47.63208,
maxlat=121.521455,
xstep=100.0,
ystep=100.0,
minfreq=100
Creating paramter file /home/nasrim/data/t2vec/data/geo-param-cell100
Processed 100000 trips
ERROR: LoadError: BoundsError: attempt to access 0-element Array{Pair{Int64,Int64},1} at index [0]
Stacktrace:
[1] getindex(::Array{Pair{Int64,Int64},1}, ::Int64) at ./array.jl:809
[2] makeVocab!(::SpatialRegion, ::String) at /data/nasrim/t2vec/preprocessing/SpatialRegionTools.jl:205
[3] top-level scope at /data/nasrim/t2vec/preprocessing/preprocess.jl:53
[4] include(::Function, ::Module, ::String) at ./Base.jl:380
[5] include(::Module, ::String) at ./Base.jl:368
[6] exec_options(::Base.JLOptions) at ./client.jl:296
[7] _start() at ./client.jl:506
in expression starting at /data/nasrim/t2vec/preprocessing/preprocess.jl:48

I'm struggling a lot to make this work. Could you please help me solve this error? I really appreciate your help.

Harbin data set from original paper

Hello boathit,

In the original paper you use two data sets - one from Porto and one from Harbin. I have been doing some experiments with trajectories from Porto, but I wanted to see whether the results are consistent on another corpus - Harbin.

Could you please point me to a link where the Harbin trajectories are uploaded?

Best regards,
Zrrr

where is the part of utilizing skip-gram idea to learn cell representation?

Hello, boathit. I am new to Julia and trajectory representation learning.
First, after reading the paper, I understand that the first step is to transform trajectories into cells and learn a cell representation using the skip-gram idea, but I couldn't find it in your source code. Is it done in the Julia code or elsewhere?
Second, since Julia 0.6.4 is no longer maintained, I hope you could update your code to the latest version for us to learn from. Thank you!
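
(For context, a minimal sketch of what such a skip-gram pretraining step could look like; this is not code from the repository, and the file name, dimensions, and window size are assumptions.)

    from gensim.models import Word2Vec

    # Treat each trajectory's cell-token sequence as a "sentence" (assumed format:
    # one space-separated sequence of cell tokens per line in data/train.src).
    with open("data/train.src") as f:
        sentences = [line.split() for line in f]

    # sg=1 selects the skip-gram objective; the hyperparameters are illustrative.
    w2v = Word2Vec(sentences, vector_size=256, window=5, sg=1, min_count=1, workers=4)
    w2v.wv.save_word2vec_format("cell-embeddings.txt")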

Creating test data for similarity measure

Hi,

I need to run the experiments for the self-similarity and cross-similarity measures. I could not find these experiments in the Jupyter notebook.
I first need to create the Ta and Tb in the following lines:

## self-similarity
Tb = h5open("trj.h5", "r") do f
    read(f["layer3"])
end
Ta = h5open("trj-distort06.h5", "r") do f
    read(f["layer3"])
end

d = colwise(Euclidean(), Ta, Tb)
mean(d)

### cross-similarity
da = colwise(Euclidean(), Ta, Ta[:, end:-1:1])
db = colwise(Euclidean(), Tb, Tb[:, end:-1:1])
mean(abs(db - da) ./ db)

To that end, I'm using the following piece of code to create Ta (after applying t2vec to vecfile):

## Creating test files for similarity computation.
prefix = "exp1"
do_split = true
start = 1_000_000+20_000
num_query = 1000
num_db = 100_000
querydbfile = joinpath(datapath, "$prefix-querydb.h5")
tfile = joinpath(datapath, "$prefix-trj.t")
labelfile = joinpath(datapath, "$prefix-trj.label")
vecfile = joinpath(datapath, "$prefix-trj.h5")

createQueryDB("$datapath/$cityname.h5", start, num_query, num_db,
               (x, y)->(x, y),
               (x, y)->(x, y);
               do_split=do_split,
               querydbfile=querydbfile)
createTLabel(region, querydbfile; tfile=tfile, labelfile=labelfile)

But still, I need to create 'Tb'. Could you please help me create this dataset?

Problems on "yoffset * region.numx + xoffset" in SpatialRegionTools.jl, 120

"""
Web Mercator coordinate to cell id
"""
function coord2cell(region::SpatialRegion, x::Float64, y::Float64)
    xoffset = round(x - region.minx, digits=6) / region.xstep
    yoffset = round(y - region.miny, digits=6) / region.ystep
    xoffset = convert(Int, floor(xoffset))
    yoffset = convert(Int, floor(yoffset))
    yoffset * region.numx + xoffset
end
Hi @boathit, in the preprocessing part, in file SpatialRegionTools.jl line 120, I am confused about how coordinates are projected into cells here. From my understanding, every cell we define should be assigned an id, which could be from 0 to vocab_size-1. But here the id equals yoffset * region.numx + xoffset. Why is this? Thank you.
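
(A minimal illustration of the row-major mapping, not code from the repository; the grid size is a made-up toy value.)

    # A grid with numx columns and numy rows, flattened row by row, gives every
    # (xoffset, yoffset) pair a distinct id in 0 .. numx*numy - 1.
    numx, numy = 4, 3          # hypothetical grid size

    def cell_id(xoffset, yoffset):
        return yoffset * numx + xoffset

    ids = [cell_id(x, y) for y in range(numy) for x in range(numx)]
    print(ids)                 # [0, 1, 2, ..., 11] -- each cell gets a unique id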

problem:julia preprocess.jl

Hello, when I run julia preprocess.jl I get:
ERROR: LoadError: AssertionError: Build index for region first
Stacktrace:
[1] cell2vocab at /Users/huangrui/app/python_projects/t2vec-master/preprocessing/SpatialRegionTools.jl:261 [inlined]
[2] gps2vocab(::SpatialRegion, ::Float64, ::Float64) at /Users/huangrui/app/python_projects/t2vec-master/preprocessing/SpatialRegionTools.jl:281
[3] trip2seq(::SpatialRegion, ::Array{Float64,2}) at /Users/huangrui/app/python_projects/t2vec-master/preprocessing/SpatialRegionTools.jl:288
[4] (::getfield(Main, Symbol("##21#23")){Int64,Int64,Int64,SpatialRegion,typeof(downsamplingDistort),Int64,Int64,getfield(Main, Symbol("#seq2str#22"))})(::HDF5File) at /Users/huangrui/app/python_projects/t2vec-master/preprocessing/SpatialRegionTools.jl:324
[5] #h5open#5(::Bool, ::Function, ::getfield(Main, Symbol("##21#23")){Int64,Int64,Int64,SpatialRegion,typeof(downsamplingDistort),Int64,Int64,getfield(Main, Symbol("#seq2str#22"))}, ::String, ::Vararg{String,N} where N) at /Users/huangrui/.julia/packages/HDF5/Zh9on/src/HDF5.jl:683
[6] h5open at /Users/huangrui/.julia/packages/HDF5/Zh9on/src/HDF5.jl:681 [inlined]
[7] #createTrainVal#20(::Int64, ::Int64, ::Int64, ::Function, ::SpatialRegion, ::String, ::typeof(downsamplingDistort), ::Int64, ::Int64) at /Users/huangrui/app/python_projects/t2vec-master/preprocessing/SpatialRegionTools.jl:318
[8] createTrainVal(::SpatialRegion, ::String, ::Function, ::Int64, ::Int64) at /Users/huangrui/app/python_projects/t2vec-master/preprocessing/SpatialRegionTools.jl:317
[9] top-level scope at none:0
[10] include at ./boot.jl:317 [inlined]
[11] include_relative(::Module, ::String) at ./loading.jl:1044
[12] include(::Module, ::String) at ./sysimg.jl:29
[13] exec_options(::Base.JLOptions) at ./client.jl:266
[14] _start() at ./client.jl:425
in expression starting at /Users/huangrui/app/python_projects/t2vec-master/preprocessing/preprocess.jl:55

The version of Julia is 1.0.5.

The purpose of bucket when training

Hi,

Thanks for your insight paper and code!

I am wondering why different buckets are used to generate the training data, sampled according to the probability of each bucket.

And how is the size of each bucket determined?

## select bucket

Thanks a lot!
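
(Not from the repository - a hedged sketch of length-based bucketing with probability-proportional sampling; the boundaries and sizes are made-up illustrative numbers. Grouping sequences of similar length into one bucket keeps padding within a minibatch small.)

    import numpy as np

    # Hypothetical buckets defined by source-sequence length, with the number of
    # training pairs that fell into each bucket (illustrative values only).
    bucket_ranges = [(20, 30), (30, 50), (50, 100)]
    bucket_sizes  = [120_000, 60_000, 20_000]

    # Pick a bucket with probability proportional to its size; a minibatch is then
    # drawn from that bucket, so sequences in one batch have similar lengths.
    p = np.array(bucket_sizes) / sum(bucket_sizes)
    bucket_idx = np.random.choice(len(bucket_ranges), p=p)
    print(bucket_ranges[bucket_idx])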

problem: julia preprocess.jl

When I run julia preprocess.jl:

HDF5-DIAG: Error detected in HDF5 (1.12.0) thread 0:
#000: H5F.c line 793 in H5Fopen(): unable to open file
major: File accessibility
minor: Unable to open file
#001: H5VLcallback.c line 3500 in H5VL_file_open(): open failed
major: Virtual Object Layer
minor: Can't open object
#002: H5VLcallback.c line 3465 in H5VL__file_open(): open failed
major: Virtual Object Layer
minor: Can't open object
#003: H5VLnative_file.c line 100 in H5VL__native_file_open(): unable to open file
major: File accessibility
minor: Unable to open file
#004: H5Fint.c line 1707 in H5F_open(): unable to read superblock
major: File accessibility
minor: Read failed
#005: H5Fsuper.c line 621 in H5F__super_read(): truncated file: eof = 96, sblock->base_addr = 0, stored_eof = 2048
major: File accessibility
minor: File has been truncated
ERROR: LoadError: Error opening file /home/ccit19/ljh/t2vec/data/porto.h5
Stacktrace:
[1] error(::String, ::String)
@ Base ./error.jl:42
[2] h5f_open(pathname::String, flags::UInt16, fapl_id::HDF5.Properties)
@ HDF5 ~/.julia/packages/HDF5/VJkAi/src/api.jl:614
[3] h5open(filename::String, mode::String; swmr::Bool, pv::Base.Iterators.Pairs{Union{}, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
@ HDF5 ~/.julia/packages/HDF5/VJkAi/src/HDF5.jl:436
[4] h5open(::var"#4#9"{SpatialRegion}, ::String, ::Vararg{String, N} where N; swmr::Bool, pv::Base.Iterators.Pairs{Union{}, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
@ HDF5 ~/.julia/packages/HDF5/VJkAi/src/HDF5.jl:456
[5] h5open
@ ~/.julia/packages/HDF5/VJkAi/src/HDF5.jl:456 [inlined]
[6] makeVocab!(region::SpatialRegion, trjfile::String)
@ Main ~/ljh/t2vec/preprocessing/SpatialRegionTools.jl:185
[7] top-level scope
@ ~/ljh/t2vec/preprocessing/preprocess.jl:55
in expression starting at /home/ccit19/ljh/t2vec/preprocessing/preprocess.jl:50

Why does h5open() raise this error?

vocab_size in preprocessing

Dear @boathit
I have issues when I preprocess the data at cell size 25, and it gives me a different result:
when I do the preprocessing, the vocab size is 40004, not 60004 as shown in the paper,
and that gives a really bad result.

please advise
thank you very much
Best,

about cell representation

Thank you for the source code! But I can't find anything about cell-representation pre-training in train.py; can you point me to it?

RLE in trip2seq

Hi boathit,

Thanks for this inspirational work!

I notice that you adopt RLE (Run-Length Encoding) instead of the full series of tokens to represent one trajectory, which you didn't mention in your paper. As far as I know, this is not a traditional way to compress trajectory data: if you only keep the single values without counts, how can you recover the trajectory from it? This compression may cause some information loss during training. But when I compared model_rle with model_full, model_full surprisingly outperforms model_rle on three tasks, especially self-similarity :)

In addition, by adopting RLE on cell tokens, the length of a trajectory is reduced by a large margin. When I tried to apply t2vec to other datasets with longer trajectories, using the full trajectory instead of RLE I couldn't handle trajectories longer than 400 points with the default model parameters (I use a GPU server with 24 GB of graphics memory).

I hope this repo is still active : ) Thanks in advance.
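
(A minimal run-length-style sketch of the idea being discussed, not the repository's implementation; the token values are made up.)

    from itertools import groupby

    def collapse_consecutive(tokens):
        # Keep one representative per run of identical consecutive tokens;
        # the counts are dropped, so the exact original sequence is not recoverable.
        return [tok for tok, _run in groupby(tokens)]

    print(collapse_consecutive([7, 7, 7, 12, 12, 7, 9, 9]))   # [7, 12, 7, 9]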

code for pretraining embeddings

Hello,
As far as I can tell the code for pretraining embeddings is not in the repo. Would it be possible to describe or point to the implementation?

Mapping from porto.h5 to trj.h5

Dear @boathit,

I am currently trying to find out how the raw GPS dataset porto.h5 can be mapped to the vector representation trj.h5.

porto.h5 contains 1 704 759 raw GPS trajectories, whereas trj.t stores only 101 000 sequences of hot cells. The vector representation in trj.h5 also holds 101 000 values, leading me to believe they are the corresponding embeddings of trj.t.

Is there some way to find out which trajectories from porto.h5 correspond to the ones in trj.t? (For example: the first 101 000 from the 1 704 759.)

Thank you in advance,
ZrrrKIT

Embedding with vocab size out of bounds

Hi,

When I upgraded PyTorch to the latest version, I got this error when getting the vocab from the decoder:

File "/home/d2tec/train.py", line 51, in KLDIVloss
    outputk = torch.gather(output, 1, indices)
RuntimeError: index 18864 is out of bounds for dimension 1 with size 18864

Since the vocab_size in the network is 18864, when using index_select to select the top-k IDs for the target it is possible to select ID = 18864 or 18865, but the output only has 18864 dimensions, so the problem arises.

t2vec/train.py

Line 31 in f518c3e

def KLDIVloss(output, target, criterion, V, D):

I am really confused about the error, because it only appeared when I upgraded PyTorch.

I don't know whether I understand this correctly; I hope to see your reply. Many thanks!
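
(A minimal illustration with toy sizes, not the repository's actual tensors: valid gather indices for a dimension of size V are 0 .. V-1, so an index equal to the vocabulary size fails.)

    import torch

    V = 5                                    # toy vocabulary size (assumption)
    output = torch.randn(2, V)               # e.g. scores over V tokens

    ok = torch.gather(output, 1, torch.tensor([[0, 4], [1, 2]]))   # indices < V: fine

    try:
        torch.gather(output, 1, torch.tensor([[0, 5], [1, 2]]))    # 5 == V: out of bounds
    except RuntimeError as e:
        print(e)   # e.g. "index 5 is out of bounds for dimension 1 with size 5"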

ValueError: setting an array element with a sequence. The requested array has an inhomogeneous shape after 1 dimensions.

Hello boathit, I am a beginner in trajectory mining. When I was training, I encountered a problem that I could not solve. I hope to get your answer. The problem is:
Traceback (most recent call last):
File "t2vec.py", line 121, in
train(args)
File "/home/ubuntu/gsh/t2vec-master/train.py", line 177, in train
trainData.load(args.max_num_line)
File "/home/ubuntu/gsh/t2vec-master/data_utils.py", line 155, in load
self.srcdata = list(map(np.array, self.srcdata))
ValueError: setting an array element with a sequence. The requested array has an inhomogeneous shape after 1 dimensions. The detected shape was (5453003,) + inhomogeneous part.
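
(For context, recent NumPy versions raise exactly this ValueError when an array is built from sequences of unequal length unless dtype=object is requested explicitly; a minimal reproduction with toy data, unrelated to the actual trajectory files.)

    import numpy as np

    ragged = [[1, 2, 3], [4, 5]]            # sequences of different lengths

    try:
        np.array(ragged)                    # NumPy >= 1.24: ValueError (inhomogeneous shape)
    except ValueError as e:
        print(e)

    arr = np.array(ragged, dtype=object)    # an explicit object dtype is accepted
    print(arr.shape)                        # (2,)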

Problems running the program

Dear @boathit,
When I ran the program up to the experiment step "julia createTest.jl", an error appeared:
ERROR: LoadError: LoadError: UndefVarError: T not defined
...
in expression starting at /home/workplace/t2vec-master/t2vec-master/experiment/utils.jl:160
...
The error is located in this function:
ranksearch{T<:Real}(query::Matrix{T},
           queryLabel::Vector{Int},
           db::Matrix{T},
           dbLabel::Vector{Int})
When I removed {T<:Real}, I got

ERROR: LoadError: UndefVarError: jldopen not defined

in function loadregion in SpatialRegionTools:
jldopen(paramfile, "r") do f
...

I think it is an environment configuration error, but I tried Julia versions from 1.0.0 to 1.0.5 to 1.1.0, and currently 1.3.0, and this error appeared with all of them. I am not very familiar with Julia, so I would like to ask what the problem is. Thank you very much for taking the time to reply despite your busy schedule.

Thank you in advance,

about process.jl

Hi Brother Li! Thank you very much for your help. When I reproduced your experiment, I found that the preprocessing is written in the Julia language, which is harder for me to read. Excuse me,
what do the three data sets train.mta, train.src, and train.trg represent? And, following the idea of the paper, is the trained model one that can turn low-quality trajectory point information into high-quality trajectory point information? Thank you very much!

Problem when training

Hi,

I could successfully run the data generation and preprocessing code. I faced the below error when training the model:
RuntimeError: CUDA error: device-side assert triggered

Then I realized that I can avoid this error by running on CPU rather than GPU. Then I faced the error below:
Traceback (most recent call last):
File "t2vec.py", line 121, in
train(args)
File "/data/nasrim/t2vec/train.py", line 260, in train
genloss = genLoss(gendata, m0, m1, lossF, args)
File "/data/nasrim/t2vec/train.py", line 103, in genLoss
loss += lossF(o, t)
File "/data/nasrim/t2vec/train.py", line 207, in
lossF = lambda o, t: KLDIVloss(o, t, criterion, V, D)
File "/data/nasrim/t2vec/train.py", line 42, in KLDIVloss
outputk = torch.gather(output, 1, indices)
RuntimeError: Invalid index in gather at /pytorch/aten/src/TH/generic/THTensorEvenMoreMath.cpp:472

It seems that an index is out of range. Could you please help me solve this issue?
