
perslay's Issues

Extended persistence implementation

Dear Mathieu,

Could you explain how the PersLay implementation of extended persistence differs from the Cohen-Steiner et al. paper you cite?

You form the extended persistence matrix not as the block matrix [[A, P], [0, D]], but by implicitly adding a dummy vertex [-2] to every vertex and edge of the graph, thus forming 1- and 2-simplices and effectively coning the graph with that dummy vertex.

Could you explain the idea behind this? What is the size of the resulting extended persistence matrix: is it 2n x 2n, or maybe (2n+1) x (2n+1)? And how do you parse the "distorted diagram" variable to recover the Ord, Rel, and Ext diagrams?
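
(For context, a minimal sketch of the coning construction being asked about, written against gudhi's SimplexTree. This is a reconstruction under stated assumptions, not the verbatim perslay code; in particular, the shift applied to the coned simplices and the bookkeeping used to split the distorted diagram may differ from what utils.py actually does.)

    import numpy as np
    import gudhi

    def coned_extended_persistence_sketch(A, f):
        # A: (n, n) symmetric adjacency matrix, f: (n,) filtration values on the vertices.
        n = A.shape[0]
        fmax = float(f.max())
        st = gudhi.SimplexTree()
        # Ascending sweep: the graph itself, each edge filtered by the max of its endpoints.
        for i in range(n):
            st.insert([i], filtration=float(f[i]))
        for i, j in zip(*np.where(np.triu(A, 1) > 0)):
            st.insert([int(i), int(j)], filtration=float(max(f[i], f[j])))
        # Descending sweep, encoded by coning with the dummy vertex -2: every coned simplex
        # is pushed above the ascending part by the (assumed) shift t -> 2*fmax - t.
        st.insert([-2], filtration=float(f.min()))
        for i in range(n):
            st.insert([i, -2], filtration=2.0 * fmax - float(f[i]))
        for i, j in zip(*np.where(np.triu(A, 1) > 0)):
            st.insert([int(i), int(j), -2], filtration=2.0 * fmax - float(min(f[i], f[j])))
        # Mirrors the call in utils.py; a safety net in case an assignment broke monotonicity.
        st.make_filtration_non_decreasing()
        # A single "distorted" diagram mixing Ord0, Rel1, Ext0 and Ext1 points. Roughly:
        # points with birth and death <= fmax are ordinary, points with both >= fmax come
        # from the coned (relative) part, and points straddling fmax are extended pairs;
        # the coned coordinates then have to be un-shifted back to the original scale.
        return st.persistence()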

error on Google Colab for tutorialPersLay.ipynb

This line:

_, tr, te = evaluate_model(L,F,D,train,test,model,optimizer,loss,metrics,num_epochs=epochs,verbose=0,plots=True)

results in the following error in the Google Colab notebook for tutorialPersLay.ipynb with tensorflow==2.4.0 installed:

ValueError: Input 0 of layer sequential_5 is incompatible with the layer: expected axis -1 of input shape to have value 16039 but received input with shape (None, 439)

---------------------------------------------------------------------------

ValueError                                Traceback (most recent call last)

<ipython-input-42-7dbdefe827e4> in <module>()
----> 1 _, tr, te = evaluate_model(L,F,D,train,test,model,optimizer,loss,metrics,num_epochs=epochs,verbose=0,plots=True)

10 frames

/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/func_graph.py in wrapper(*args, **kwargs)
    975           except Exception as e:  # pylint:disable=broad-except
    976             if hasattr(e, "ag_error_metadata"):
--> 977               raise e.ag_error_metadata.to_exception(e)
    978             else:
    979               raise

ValueError: in user code:

    /usr/local/lib/python3.7/dist-packages/tensorflow/python/keras/engine/training.py:805 train_function  *
        return step_function(self, iterator)
    /usr/local/lib/python3.7/dist-packages/perslay/perslay.py:258 call  *
        final_representations = self.rho(concat_representations) if self.rho != "identity" else concat_representations
    /usr/local/lib/python3.7/dist-packages/tensorflow/python/keras/engine/base_layer.py:998 __call__  **
        input_spec.assert_input_compatibility(self.input_spec, inputs, self.name)
    /usr/local/lib/python3.7/dist-packages/tensorflow/python/keras/engine/input_spec.py:259 assert_input_compatibility
        ' but received input with shape ' + display_shape(x.shape))

    ValueError: Input 0 of layer sequential_5 is incompatible with the layer: expected axis -1 of input shape to have value 16039 but received input with shape (None, 439)
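
(A hedged reading of this error, an assumption rather than a confirmed diagnosis: sequential_5 is the rho network, and Keras reports that it was already built for a 16039-dimensional input while the concatenated PersLay representations now have dimension 439. That mismatch typically appears when a rho/model object built for a different PersLay configuration or dataset is reused, e.g. after re-running notebook cells out of order. Re-creating rho without a fixed input size lets Keras infer the dimension at the first call, as in the hypothetical snippet below; num_classes is a stand-in for the dataset's number of labels.)

    import tensorflow as tf

    num_classes = 2  # hypothetical; use the actual number of classes
    rho = tf.keras.Sequential([
        tf.keras.layers.Dense(num_classes, activation="softmax")  # input dim inferred on first call
    ])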

error during the training in tutorialPersLay.ipynb

Hi, @MathieuCarriere ,

When I try your updated (tf2.0) version of perslay, I get the following error during training in tutorialPersLay.ipynb.
The cell that raises the error is this one:
_, tr, te = evaluate_model(L,F,D,train,test,model,optimizer,loss,metrics,num_epochs=epochs,verbose=0,plots=True)

The full error is below. Could you give some hints on how to solve this issue? Thanks!

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-59-7dbdefe827e4> in <module>
----> 1 _, tr, te = evaluate_model(L,F,D,train,test,model,optimizer,loss,metrics,num_epochs=epochs,verbose=0,plots=True)

/media/root/mdata/data/code13/perslay/tutorial/experiments.py in evaluate_model(L, F, D, train_sub, test_sub, model, optimizer, loss, metrics, num_epochs, batch_size, verbose, plots)
    380 
    381     model.compile(loss=loss, optimizer=optimizer, metrics=metrics)
--> 382     history = model.fit(x=[diags_train, feats_train], y=label_train, validation_data=([diags_test, feats_test], label_test), epochs=num_epochs, batch_size=batch_size, shuffle=True, verbose=verbose)
    383     train_results = model.evaluate([diags_train, feats_train], label_train, verbose=verbose)
    384     test_results = model.evaluate([diags_test,  feats_test],  label_test, verbose=verbose)

~/anaconda3/envs/tf2.0/lib/python3.6/site-packages/tensorflow_core/python/keras/engine/training.py in fit(self, x, y, batch_size, epochs, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight, initial_epoch, steps_per_epoch, validation_steps, validation_freq, max_queue_size, workers, use_multiprocessing, **kwargs)
    726         max_queue_size=max_queue_size,
    727         workers=workers,
--> 728         use_multiprocessing=use_multiprocessing)
    729 
    730   def evaluate(self,

~/anaconda3/envs/tf2.0/lib/python3.6/site-packages/tensorflow_core/python/keras/engine/training_v2.py in fit(self, model, x, y, batch_size, epochs, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight, initial_epoch, steps_per_epoch, validation_steps, validation_freq, **kwargs)
    222           validation_data=validation_data,
    223           validation_steps=validation_steps,
--> 224           distribution_strategy=strategy)
    225 
    226       total_samples = _get_total_number_of_samples(training_data_adapter)

~/anaconda3/envs/tf2.0/lib/python3.6/site-packages/tensorflow_core/python/keras/engine/training_v2.py in _process_training_inputs(model, x, y, batch_size, epochs, sample_weights, class_weights, steps_per_epoch, validation_split, validation_data, validation_steps, shuffle, distribution_strategy, max_queue_size, workers, use_multiprocessing)
    545         max_queue_size=max_queue_size,
    546         workers=workers,
--> 547         use_multiprocessing=use_multiprocessing)
    548     val_adapter = None
    549     if validation_data:

~/anaconda3/envs/tf2.0/lib/python3.6/site-packages/tensorflow_core/python/keras/engine/training_v2.py in _process_inputs(model, x, y, batch_size, epochs, sample_weights, class_weights, shuffle, steps, distribution_strategy, max_queue_size, workers, use_multiprocessing)
    592         batch_size=batch_size,
    593         check_steps=False,
--> 594         steps=steps)
    595   adapter = adapter_cls(
    596       x,

~/anaconda3/envs/tf2.0/lib/python3.6/site-packages/tensorflow_core/python/keras/engine/training.py in _standardize_user_data(self, x, y, sample_weight, class_weight, batch_size, check_steps, steps_name, steps, validation_split, shuffle, extract_tensors_from_dataset)
   2417     # First, we build the model on the fly if necessary.
   2418     if not self.inputs:
-> 2419       all_inputs, y_input, dict_inputs = self._build_model_with_inputs(x, y)
   2420       is_build_called = True
   2421     else:

~/anaconda3/envs/tf2.0/lib/python3.6/site-packages/tensorflow_core/python/keras/engine/training.py in _build_model_with_inputs(self, inputs, targets)
   2580     # or lists of arrays, and extract a flat list of inputs from the passed
   2581     # structure.
-> 2582     training_utils.validate_input_types(inputs, orig_inputs)
   2583 
   2584     if isinstance(inputs, (list, tuple)):

~/anaconda3/envs/tf2.0/lib/python3.6/site-packages/tensorflow_core/python/keras/engine/training_utils.py in validate_input_types(inp, orig_inp, allow_dict, field_name)
   1141       raise ValueError(
   1142           'Please provide as model inputs either a single array or a list of '
-> 1143           'arrays. You passed: {}={}'.format(field_name, str(orig_inp)))
   1144   elif isinstance(inp, dict):
   1145     if not allow_dict:

ValueError: Please provide as model inputs either a single array or a list of arrays. You passed: inputs=[[array(...), array(...), array(...), array(...)], array(...)]

(Full array dump omitted: the first element of inputs is a list of four float32 diagram arrays, and the second is the feature matrix.)
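
(A hedged observation on this one: as the message shows, x was passed to model.fit as a nested list — a list of four diagram arrays plus the feature matrix, wrapped in an outer list — and the Keras version in this environment only accepts a flat list of arrays at that point. The snippet below only illustrates the two shapes with hypothetical arrays; whether flattening the inputs (and adapting PerslayModel.call accordingly) or moving to a TensorFlow release whose input adapters accept nested structures is the right fix is for the maintainer to confirm.)

    import numpy as np

    # Hypothetical stand-ins for the tutorial's diags_train / feats_train.
    diags_train = [np.zeros((10, 5, 3), dtype=np.float32) for _ in range(4)]
    feats_train = np.zeros((10, 439))

    x_rejected = [diags_train, feats_train]         # nested list -> the ValueError above
    x_flat = list(diags_train) + [feats_train]      # flat list of arrays, as the message requests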

error when running tutorialPersLay.ipynb

Hi, @MathieuCarriere ,

I get the following error when I execute this line in the tutorialPersLay.ipynb notebook:
diags_tmp, feats, labels = load_diagfeatlabels(dataset)

Any hints on how to solve this issue?
Thanks!

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-12-6f113c568970> in <module>()
----> 1 diags_tmp, feats, labels = load_diagfeatlabels(dataset)

~/anaconda3/envs/gudhi/lib/python3.6/site-packages/perslay/experiments.py in load_diagfeatlabels(dataset, path_dataset, filtrations, verbose)
    287     # Extract and encode labels with integers
    288     L = np.array(LabelEncoder().fit_transform(np.array(feat["label"])))
--> 289     L = OneHotEncoder(sparse=False, categories="auto").fit_transform(L[:, np.newaxis])
    290 
    291     # Extract features

TypeError: __init__() got an unexpected keyword argument 'categories'
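
(A hedged note on a likely cause, to be confirmed: the categories keyword only exists in scikit-learn 0.20 and later, so an older scikit-learn raises exactly this TypeError on the OneHotEncoder call inside load_diagfeatlabels. A minimal version check and a standalone reproduction of that call, assuming this is indeed the cause:)

    import numpy as np
    import sklearn
    from sklearn.preprocessing import LabelEncoder, OneHotEncoder

    print(sklearn.__version__)  # needs >= 0.20 for OneHotEncoder(categories="auto")

    labels = np.array(["A", "B", "A", "C"])  # hypothetical labels
    L = LabelEncoder().fit_transform(labels)
    # Note: `sparse=` was later renamed `sparse_output=` in scikit-learn 1.2.
    L = OneHotEncoder(sparse=False, categories="auto").fit_transform(L[:, np.newaxis])
    print(L)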


Error when running tutorialPersLay.ipynb

Hi Mathieu,

I am able to run test.py, but when I tried to run the tutorial I got the following error. Do I need to update gudhi (installed from conda) to get it to work?

Thanks!
Chen

import gudhi
print(gudhi.__version__)  # 2.1.0
generate(dataset)


AttributeError                            Traceback (most recent call last)
in ()
      1 import gudhi
      2 print(gudhi.__version__)
----> 3 generate(dataset)

~/Documents/osu/Research/perslay/utils.py in generate(dataset)
    242         hks_time = float(filtration.split("-")[0])
    243         filtration_val = hks_signature(egvectors, egvals, time=hks_time)
--> 244         dgmOrd0, dgmExt0, dgmRel1, dgmExt1 = apply_graph_extended_persistence(A, filtration_val, basesimplex)
    245         diag_file["Ord0_" + filtration].create_dataset(name=str(gid), data=dgmOrd0)
    246         diag_file["Ext0_" + filtration].create_dataset(name=str(gid), data=dgmExt0)

~/Documents/osu/Research/perslay/utils.py in _apply_graph_extended_persistence(A, filtration_val, basesimplex)
    139         st.assign_filtration(simplex=[vid, -2], filtration=min(fd[vid, np.where(A[vid, :] > 0)[0]]))
    140 
--> 141     st.make_filtration_non_decreasing()
    142     distorted_dgm = st.persistence()
    143     normal_dgm = dict()

AttributeError: 'gudhi.SimplexTree' object has no attribute 'make_filtration_non_decreasing'
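
(A hedged check, under the assumption that make_filtration_non_decreasing was added to gudhi's SimplexTree in a release newer than 2.1.0, in which case upgrading gudhi should expose it:)

    import gudhi

    print(gudhi.__version__)
    st = gudhi.SimplexTree()
    # Prints False on old gudhi builds such as 2.1.0 above, which explains the AttributeError.
    print(hasattr(st, "make_filtration_non_decreasing"))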

Issue with RationalHat layer, local variable 'LVinit' referenced before assignment

When I use the layer "RationalHat", I get the following error:

UnboundLocalError                         Traceback (most recent call last)
~\AppData\Local\Temp/ipykernel_33568/2646110522.py in <module>
     65 rho = tf.keras.Sequential([tf.keras.layers.Dense(1, activation="sigmoid")])
     66 model = None
---> 67 model = PerslayModel(name="PersLay", diagdim=2, perslay_parameters=perslay_parameters, rho=rho)
     68 model.compile(loss='binary_crossentropy',optimizer=optimizer,metrics=['accuracy'])
     69 print(optimizer,perm_op,pweight,layer)

~\anaconda3\envs\downgrade\lib\site-packages\perslay\perslay.py in __init__(self, name, diagdim, perslay_parameters, rho)
    159             LMinit, LRinit = plp["lmean_init"], plp["lr_init"]
    160             LMiv = LMinit if not callable(LMinit) else LMinit([self.diagdim, plp["lnum"]])
--> 161             LRiv = LRinit if not callable(LRinit) else LVinit([1])
    162             LM = tf.Variable(name=Lname+"-M", initial_value=LMiv, trainable=Ltrain)
    163             LR = tf.Variable(name=Lname+"-R", initial_value=LRiv, trainable=Ltrain)

UnboundLocalError: local variable 'LVinit' referenced before assignment

When I checked the perslay.py file, I found:

elif layer == "RationalHat":
    LMinit, LRinit = plp["lmean_init"], plp["lr_init"]
    LMiv = LMinit if not callable(LMinit) else LMinit([self.diagdim, plp["lnum"]])
    LRiv = LRinit if not callable(LRinit) else LVinit([1])
    LM = tf.Variable(name=Lname+"-M", initial_value=LMiv, trainable=Ltrain)
    LR = tf.Variable(name=Lname+"-R", initial_value=LRiv, trainable=Ltrain)
    self.vars[nf].append([LM, LR])

Here LVinit is used without ever being defined, unlike in the other layer branches, where the initializer variables are defined at the beginning (right after the elif).

I don't know whether I am using the layer incorrectly (for example, whether I need to define another layer before it), but I tried the same code with all the other layers (except the image layer) and it worked without any error.
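
(For what it is worth, a sketch of the presumed fix, mirroring how the other layer branches initialize their variables; this is my assumption, not a confirmed patch: the LVinit on line 161 should read LRinit.)

    elif layer == "RationalHat":
        LMinit, LRinit = plp["lmean_init"], plp["lr_init"]
        LMiv = LMinit if not callable(LMinit) else LMinit([self.diagdim, plp["lnum"]])
        LRiv = LRinit if not callable(LRinit) else LRinit([1])   # was: LVinit([1])
        LM = tf.Variable(name=Lname+"-M", initial_value=LMiv, trainable=Ltrain)
        LR = tf.Variable(name=Lname+"-R", initial_value=LRiv, trainable=Ltrain)
        self.vars[nf].append([LM, LR])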

POT library is not listed in the dependencies

~/Documents/perslay2/expe/utils.py in <module>
     21 import h5py
     22 
---> 23 from ot.bregman import sinkhorn
     24 
     25 from sklearn.preprocessing import LabelEncoder, OneHotEncoder

ModuleNotFoundError: No module named 'ot'
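
(For anyone hitting this: the ot module comes from the POT (Python Optimal Transport) package, which indeed is not listed in the installation instructions. Installing it, for example with pip install POT, makes the import in expe/utils.py resolve; a tiny smoke test, with a hypothetical cost matrix, is sketched below.)

    import numpy as np
    from ot.bregman import sinkhorn   # provided by the POT package (pip install POT)

    a = np.ones(3) / 3                # uniform source weights
    b = np.ones(3) / 3                # uniform target weights
    M = np.random.rand(3, 3)          # hypothetical cost matrix
    P = sinkhorn(a, b, M, reg=1e-1)   # entropically regularized transport plan
    print(P.shape)                    # (3, 3)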

'generate_diagrams_and_features' is not defined

Hello,

While running the tutorial, I get the following error: 'generate_diagrams_and_features' is not defined.
My gudhi and tensorflow are correctly installed, and I cannot find anything about 'generate_diagrams_and_features' among the issues that have already been solved. Is this due to a recent update, or am I doing something wrong?
Thank you very much for any kind of help.

There is a problem when training the network. Please support.

Hi @MathieuCarriere

There is still a problem when I try to run the code to train the network. Please help me. Thank you very much.

To be more specific, the problem occurs at the code line below; I have attached the training error.

Code line:
_, tr, te = evaluate_model(L,F,D,train,test,model,optimizer,loss,metrics,num_epochs=epochs,verbose=0,plots=True)

Training error:
INFO:tensorflow:Error reported to Coordinator: in user code:

/content/perslay.py:258 call  *
    final_representations = self.rho(concat_representations) if self.rho != "identity" else concat_representations
/usr/local/lib/python3.7/dist-packages/keras/engine/base_layer.py:1020 __call__  **
    input_spec.assert_input_compatibility(self.input_spec, inputs, self.name)
/usr/local/lib/python3.7/dist-packages/keras/engine/input_spec.py:254 assert_input_compatibility
    ' but received input with shape ' + display_shape(x.shape))

ValueError: Input 0 of layer sequential_5 is incompatible with the layer: expected axis -1 of input shape to have value 16039 but received input with shape (None, 439)

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/dist-packages/tensorflow/python/training/coordinator.py", line 297, in stop_on_exception
    yield
  File "/usr/local/lib/python3.7/dist-packages/tensorflow/python/distribute/mirrored_run.py", line 346, in run
    self.main_result = self.main_fn(*self.main_args, **self.main_kwargs)
  File "/usr/local/lib/python3.7/dist-packages/tensorflow/python/autograph/impl/api.py", line 695, in wrapper
    raise e.ag_error_metadata.to_exception(e)
ValueError: in user code:

/content/perslay.py:258 call  *
    final_representations = self.rho(concat_representations) if self.rho != "identity" else concat_representations
/usr/local/lib/python3.7/dist-packages/keras/engine/base_layer.py:1020 __call__  **
    input_spec.assert_input_compatibility(self.input_spec, inputs, self.name)
/usr/local/lib/python3.7/dist-packages/keras/engine/input_spec.py:254 assert_input_compatibility
    ' but received input with shape ' + display_shape(x.shape))

ValueError: Input 0 of layer sequential_5 is incompatible with the layer: expected axis -1 of input shape to have value 16039 but received input with shape (None, 439)

ValueError                                Traceback (most recent call last)
in ()
----> 1 _, tr, te = evaluate_model(L,F,D,train,test,model,optimizer,loss,metrics,num_epochs=epochs,verbose=0,plots=True)

10 frames

/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/func_graph.py in wrapper(*args, **kwargs)
    992           except Exception as e:  # pylint:disable=broad-except
    993             if hasattr(e, "ag_error_metadata"):
--> 994               raise e.ag_error_metadata.to_exception(e)
    995             else:
    996               raise

ValueError: in user code:

/usr/local/lib/python3.7/dist-packages/keras/engine/training.py:853 train_function  *
    return step_function(self, iterator)
/content/perslay.py:258 call  *
    final_representations = self.rho(concat_representations) if self.rho != "identity" else concat_representations
/usr/local/lib/python3.7/dist-packages/keras/engine/base_layer.py:1020 __call__  **
    input_spec.assert_input_compatibility(self.input_spec, inputs, self.name)
/usr/local/lib/python3.7/dist-packages/keras/engine/input_spec.py:254 assert_input_compatibility
    ' but received input with shape ' + display_shape(x.shape))

ValueError: Input 0 of layer sequential_5 is incompatible with the layer: expected axis -1 of input shape to have value 16039 but received input with shape (None, 439)
