
FogLAMP

This is the FogLAMP project.

The open source version of FogLAMP has moved to become part of the Linux Foundation LF Edge project.

Within the Linux Foundation, FogLAMP is now known as Fledge.

  • Fledge is available on GitHub as the fledge-iot project.
  • The latest version of FogLAMP is still available from Dianomic and has all the latest Fledge features plus extra plugin functionality.
  • You may download packaged binaries from the Dianomic website; these include all available plugins for your chosen architecture.
  • Documentation is available on ReadTheDocs for both FogLAMP and Fledge, and includes details of the plugins available for each.

This repository is now archived.

FogLAMP is an open source platform for the Internet of Things and an essential component in fog computing. It uses a modular microservices architecture that provides sensor data collection, storage, processing and forwarding to historians, enterprise systems and cloud-based services. FogLAMP can run in highly available, stand-alone, unattended environments that assume unreliable network connectivity.

FogLAMP also provides a means of buffering data coming from sensors and forwarding that data onto high level storage systems. It assumes the underlying network layer is not always connected or may not be reliable. Data from sensors may be stored within FogLAMP for a number of days before being purged from the FogLAMP storage. During this time it may be sent to one or more historians and also accessed via a REST API for use by local analytical applications.
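The store-and-forward behaviour described above can be sketched conceptually. This is an illustrative Python toy, not FogLAMP's actual Storage service code; the class, method names and retention value are invented for the example:

```python
from collections import deque

RETENTION_DAYS = 7  # illustrative retention window, not a FogLAMP default

class Buffer:
    """Toy store-and-forward buffer: hold readings locally, forward them
    north when the network is up, purge them after the retention window."""

    def __init__(self):
        self.readings = deque()  # [day, reading, sent] triples

    def store(self, day, reading):
        self.readings.append([day, reading, False])

    def forward(self, connected):
        """Attempt to send unsent readings north; do nothing when offline."""
        sent = []
        if connected:
            for entry in self.readings:
                if not entry[2]:
                    entry[2] = True
                    sent.append(entry[1])
        return sent

    def purge(self, today):
        """Drop readings older than the retention window."""
        while self.readings and today - self.readings[0][0] > RETENTION_DAYS:
            self.readings.popleft()
```

Note that purging is driven by age, not by delivery status, which is why readings remain queryable via the REST API for the retention period even after they have been sent north.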

FogLAMP has been designed to run in a Linux environment and makes use of Linux services.

Architecture

FogLAMP is built using a microservices architecture for the major component areas. These services consist of:

  • a Core service responsible for the management of the other services, the external REST APIs, and the scheduling and monitoring of activities.
  • a South service responsible for the communication between FogLAMP and the sensors/actuators.
  • a Storage service responsible for the persistence of configuration and metrics and the buffering of sensor data.

FogLAMP makes extensive use of plugin components in order to increase the flexibility of the implementation:

  • South plugins allow for the easy expansion of FogLAMP to deal with new South devices and South device connection buses.
  • North plugins allow for connection to different historians.
  • Datastore plugins allow FogLAMP to use different storage mechanisms for persisting metadata and sensor data.
  • Authentication provider plugins allow the authentication mechanism to be matched with enterprise requirements or provided internally by FogLAMP.

The other paradigm that is used extensively within FogLAMP is the idea of scheduling processes to perform specific operations. The FogLAMP core contains a scheduler which can execute processes based on time schedules or triggered by events. This is used to start processes when an event occurs, such as FogLAMP starting, or based on a time trigger.

Scheduled processes are used to send data from FogLAMP to the historian, to purge data from the FogLAMP data buffer, to gather statistics for historical analysis and to perform backups of the FogLAMP environment.
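The scheduling paradigm above can be sketched conceptually. This is an illustrative Python toy, not FogLAMP's actual scheduler; the class and function names are invented, and only two of the schedule types (STARTUP, fired once when the scheduler starts, and INTERVAL, fired on a repeat period) are modelled:

```python
class Schedule:
    """A minimal stand-in for a FogLAMP schedule entry."""
    def __init__(self, name, process, stype, repeat=0):
        self.name = name        # e.g. "purge"
        self.process = process  # callable standing in for the process to spawn
        self.stype = stype      # "STARTUP" or "INTERVAL"
        self.repeat = repeat    # seconds between runs for INTERVAL schedules
        self.next_run = 0.0

def run(schedules, now=0.0, until=10.0):
    """Fire STARTUP schedules once, then INTERVAL schedules on their period."""
    fired = []
    for s in schedules:
        if s.stype == "STARTUP":
            fired.append((now, s.name))
            s.process()
        else:
            s.next_run = now + s.repeat
    while now < until:
        now += 1.0  # one simulated second per tick
        for s in schedules:
            if s.stype == "INTERVAL" and now >= s.next_run:
                fired.append((now, s.name))
                s.process()
                s.next_run = now + s.repeat
    return fired
```

In the real system the scheduler spawns separate processes and also supports TIMED and MANUAL schedules, as shown in the schedule listings later in this document.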

Building FogLAMP

Build Prerequisites

FogLAMP is currently based on C/C++ and Python code. The packages needed to build and run FogLAMP are:

  • autoconf
  • automake
  • avahi-daemon
  • build-essential
  • cmake
  • curl
  • g++
  • libtool
  • libboost-dev
  • libboost-system-dev
  • libboost-thread-dev
  • libpq-dev
  • libssl-dev
  • libz-dev
  • make
  • postgresql
  • python3-pip
  • python-dev
  • python3-dev
  • uuid-dev
  • sqlite3
  • libsqlite3-dev

Linux distributions

FogLAMP can be built or installed on one of the following Linux distributions:

  • Ubuntu 16.04 and Ubuntu 18.04
  • Raspbian Stretch and Buster
  • Red Hat 7.6
  • CentOS 7.6
  • Coral Mendel

Install the prerequisites on Ubuntu

On Ubuntu-based Linux distributions the packages can be installed with the provided requirements.sh script or manually with apt-get: :

apt-get install avahi-daemon curl
apt-get install cmake g++ make build-essential autoconf automake uuid-dev
apt-get install libtool libboost-dev libboost-system-dev libboost-thread-dev libpq-dev libssl-dev libz-dev
apt-get install python-dev python3-dev python3-pip
apt-get install postgresql
apt-get install sqlite3 libsqlite3-dev

You may need to use sudo to allow apt-get to install packages, depending on your access rights.

Install the prerequisites on Red Hat/CentOS

On Red Hat and CentOS distributions the required packages can be installed automatically with the provided requirements.sh script: :

sudo ./requirements.sh

You should run this as a user with sudo access rights.

Build

To build FogLAMP run the command make in the top level directory. This will compile all the components that need to be compiled and will also create a runnable structure of the Python code components of FogLAMP.

NOTE:

  • The GCC compiler version 5.4 available in Ubuntu 16.04 LTS raises warnings. This is a known bug of the compiler and the warnings can be ignored.
  • The OpenSSL toolkit is required if you want to use the HTTPS-based REST client and certificate-based authentication.

Once make has completed, you can either test FogLAMP from your development environment or install it.

Testing FogLAMP from Your Development Environment

You can test FogLAMP directly from your development environment. All you need to do is set one environment variable to be able to run FogLAMP from the development tree: :

export FOGLAMP_ROOT=<basedir>/FogLAMP

Where <basedir> is the base directory into which you cloned the FogLAMP repository.

Finally, start the FogLAMP core daemon: :

$FOGLAMP_ROOT/scripts/foglamp start

Installing FogLAMP

Create an installation by executing make install, then set the FOGLAMP_ROOT environment variable to the installation path. By default the installation is placed in /usr/local/foglamp. You may need to execute sudo make install if the current user does not have permission to write to the installation location: :

sudo make install
export FOGLAMP_ROOT=/usr/local/foglamp

The destination may be overridden by setting the variable DESTDIR on the make command line to the location in which you wish to install FogLAMP. For example, to install FogLAMP under the /opt directory use the commands: :

sudo make install DESTDIR=/opt
export FOGLAMP_ROOT=/opt/usr/local/foglamp

Upgrading FogLAMP on Debian based systems

FogLAMP supports Kerberos authentication starting from version 1.7.1, so the related packages are installed by the requirements.sh script. The krb5-user package prompts for the KDC definition during the installation process; the packages are therefore installed with the DEBIAN_FRONTEND environment variable set to noninteractive to avoid this interaction: :

# for Kerberos authentication, avoid interactive questions
DEBIAN_FRONTEND=noninteractive apt install -yq krb5-user
apt install -y libcurl4-openssl-dev

The upgrade of the FogLAMP package should follow the same approach, executing the command: :

sudo DEBIAN_FRONTEND=noninteractive apt -y upgrade

Before upgrading FogLAMP, SETENV: should be added in /etc/sudoers.d/foglamp to allow sudo to preserve environment variables. A sample of the file: :

%sudo ALL=(ALL) NOPASSWD:SETENV: /usr/bin/apt -y update, /usr/bin/apt-get -y install foglamp, /usr/bin/apt -y install /usr/local/foglamp/data/plugins/foglamp*.deb, /usr/bin/apt list, /usr/bin/apt -y install foglamp*, /usr/bin/apt -y upgrade

Executing FogLAMP

FogLAMP is now ready to start. Use the command: :

$FOGLAMP_ROOT/bin/foglamp start

To check if FogLAMP is running, use the command: :

$FOGLAMP_ROOT/bin/foglamp status

The command returns the status of FogLAMP on the machine on which it was executed.

If You Use PostgreSQL: Creating the Database Repository

This version of FogLAMP relies on SQLite to run. SQLite is embedded into the Storage service, but you may want to use PostgreSQL for buffer and metadata storage (refer to the documentation on ReadTheDocs for more information). With a version of PostgreSQL installed via apt-get, first create a new database user with: :

sudo -u postgres createuser -d <user>

where <user> is the name of the Linux user that will run FogLAMP. The FogLAMP database user must have createdb privileges (i.e. the -d argument).

Troubleshooting

FogLAMP version 1.7.0

$FOGLAMP_ROOT/data/etc directory ownership

Executing sudo make install immediately after git clone creates a data/etc directory owned by the root user. It should be owned by the user that will run FogLAMP. To fix it: :

chown -R <user>:<user> $FOGLAMP_ROOT/data

where <user> is the name of the Linux user that will run FogLAMP.


foglamp's Issues

Error while sending OMF messages from FogLAMP to Pi Connector Relay/Pi Server

Hi,
I have enabled the schedule for sending the FogLAMP readings to the North (Pi Server) and have set the Producer Token and Ingress URL as shown in Pi Data Collection Manager, as follows:

curl -s http://localhost:8081/foglamp/schedule/2b614d26-760f-11e7-b5a5-be2e44b06b34 | jq
{
  "day": null,
  "id": "2b614d26-760f-11e7-b5a5-be2e44b06b34",
  "enabled": true,
  "time": 0,
  "repeat": 30,
  "type": "INTERVAL",
  "exclusive": true,
  "processName": "North Readings to PI",
  "name": "OMF to PI north"
}

curl -X GET http://localhost:8081/foglamp/category/SEND_PR_1 | jq
{
  "applyFilter": {
  .......

  "URL": {
    "type": "string",
    "value": "https://ibmx3530m3-52-20:5460/ingress/messages",
    "default": "https://pi-server:5460/ingress/messages",
    "description": "The URL of the PI Connector to send data to"
  },

  "producerToken": {
    "type": "string",
    "value": "uid=3d2e6c31-746e-4bd9-8ada-cc8e286b441a&crt=20180716235212830&sig=6asBQ1ceQZAMsHPOmdbZb0stGz2Paye4KEUzuXN3/eQ=",
    "default": "omf_north_0001",
    "description": "The producer token that represents this FogLAMP stream"
  },
  .....

}
Yet, I do not see anything ingested in the Pi Asset Framework.
I looked up /var/log/syslog and found these error messages - please help me resolve the problem.

Error Message from syslog:
WARNING: omf: omf_north_1: an error occurred during the request to the destination - error details |HTTPSConnectionPool(host='ibmx3530m3-52-20', port=5460): Max retries exceeded with url: /ingress/messages (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x7ff276fe33c8>: Failed to establish a new connection: [Errno -2] Name or service not known',))|
Jul 16 13:11:52 scspa0501709001 FogLAMP[31799] ERROR: omf: omf_north_1: cannot complete the sending operation - error details |an error occurred during the request to the destination - error details |HTTPSConnectionPool(host='ibmx3530m3-52-20', port=5460): Max retries exceeded with url: /ingress/messages (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x7ff276fe33c8>: Failed to establish a new connection: [Errno -2] Name or service not known',))||
Traceback (most recent call last):
  File "/usr/local/foglamp/python/foglamp/plugins/north/omf/omf.py", line 396, in plugin_send
    omf_north.create_omf_objects(raw_data, config_category_name, type_id)
  File "/usr/local/foglamp/python/foglamp/plugins/north/omf/omf.py", line 221, in wrapper
    result = _function(*arg)
  File "/usr/local/foglamp/python/foglamp/plugins/north/omf/omf.py", line 693, in create_omf_objects
    self._create_omf_objects_automatic(item)
  File "/usr/local/foglamp/python/foglamp/plugins/north/omf/omf.py", line 541, in _create_omf_objects_automatic
    typename, omf_type = self._create_omf_type_automatic(asset_info)
  File "/usr/local/foglamp/python/foglamp/plugins/north/omf/omf.py", line 581, in _create_omf_type_automatic
    self.send_in_memory_data_to_picromf("Type", omf_type[typename])
  File "/usr/local/foglamp/python/foglamp/plugins/north/omf/omf.py", line 221, in wrapper
    result = _function(*arg)
  File "/usr/local/foglamp/python/foglamp/plugins/north/omf/omf.py", line 754, in send_in_memory_data_to_picromf
    raise _error
Exception: an error occurred during the request to the destination - error details |HTTPSConnectionPool(host='ibmx3530m3-52-20', port=5460): Max retries exceeded with url: /ingress/messages (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x7ff276fe33c8>: Failed to establish a new connection: [Errno -2] Name or service not known',))|
Jul 16 13:11:52 scspa0501709001 FogLAMP[31799] ERROR: sending_process: sending_process_1: cannot complete the sending operation of a block of data.
Jul 16 13:11:52 scspa0501709001 FogLAMP[31799] ERROR: sending_process: sending_process_1: cannot complete the sending operation - error details |an error occurred during the request to the destination - error details |HTTPSConnectionPool(host='ibmx3530m3-52-20', port=5460): Max retries exceeded with url: /ingress/messages (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x7ff276fe33c8>: Failed to establish a new connection: [Errno -2] Name or service not known',))||

Testing the OMF Translator plugin against an HTTPS OMF endpoint

I see in the http://foglamp.readthedocs.io/en/master/05_testing.html#setting-the-omf-translator-plugin documentation that the example Connector Relay endpoint is HTTP, but I'd like to test against an HTTPS endpoint (using the new version of the Connector Relay that natively supports HTTPS). That endpoint, though, uses a self-signed certificate, so I imagine that certificate validation would fail. When I try to run the test outlined in the documentation, using the new HTTPS endpoint URL for my PI Connector Relay, I don't see any new AF Elements being created. Is there a place where I can check for any OMF-related errors?

Time units are misleading in warning message

When a reading violates the latency limit, a warning message is posted to the log. This message states the actual latency and the desired latency. The actual latency is in seconds and the desired latency is in milliseconds, but the message compares them directly and labels both figures in seconds.

Example:

Oct 21 10:42:03 XXXX[1145]: WARNING: Current send latency of XXXXX seconds exceeds requested maximum latency of 5000 seconds
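The fix is to convert both figures to the same unit before comparing and to label each one explicitly. A minimal sketch; the function and parameter names are hypothetical and do not come from the FogLAMP code base:

```python
def check_latency(latency_seconds, max_latency_ms):
    """Return a warning string when latency exceeds the configured maximum,
    or None otherwise. Both values are converted to seconds before the
    comparison, and the message states the unit of each figure."""
    max_latency_s = max_latency_ms / 1000.0
    if latency_seconds > max_latency_s:
        return (f"Current send latency of {latency_seconds:.1f} seconds exceeds "
                f"requested maximum latency of {max_latency_ms} milliseconds "
                f"({max_latency_s:.1f} seconds)")
    return None
```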


json.loads

readings_list = [json.loads(line) for line in f]

I do not know how .loads works, but I suspect this loads the whole array into memory, which should be avoided.

By removing the line below (pprint), there is a performance boost.
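One way to avoid materializing the list, assuming the file is in JSON Lines form (one JSON document per line) as the comprehension suggests: a generator decodes and yields one reading at a time, so memory use stays constant regardless of file size. This is an illustrative sketch, not FogLAMP code:

```python
import io
import json

def iter_readings(f):
    """Yield one decoded reading per line instead of building a list,
    skipping blank lines."""
    for line in f:
        line = line.strip()
        if line:
            yield json.loads(line)

# In-memory file standing in for the readings file (made-up data):
f = io.StringIO('{"asset": "temp", "reading": 21.5}\n'
                '{"asset": "temp", "reading": 21.7}\n')
total = sum(r["reading"] for r in iter_readings(f))
```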

FogLAMP facing problem while sending data to OSISoft

I have set up FogLAMP and also the URL and producerToken to send the data.
There were some problems initially with respect to the security settings of the Pi Points, and I set the correct security settings. Using Data Collection Manager I have created the pipeline.
However, I might be missing something in the AF setup, because whenever I ingest data into FogLAMP and it goes through the north plugin (OMF), the SENT stats increase on FogLAMP, but on OSIsoft (DCM) I am seeing this error:
Do I need to create some template - and if so, what should the format be?
NOTE: I am only interested in the data getting ingested into the Pi Archive from FogLAMP OMF - but because of the following errors the Pi Points are not getting created.

Error while checking in the pending changes: Cannot complete the operation because the user does not have rights to write to ElementTemplate 'OMF.FogLAMP Connector.0001_D_CAB/A_typename_sensor' with UniqueID 'bd2be362-4922-4e3a-83cc-baa402d43d77'. AF assets will be checked-in when a new value is sent. Full exception: OSIsoft.Tau.AssetCache.AFCheckInException: Error while checking in the pending changes: Cannot complete the operation because the user does not have rights to write to ElementTemplate 'OMF.FogLAMP Connector.0001_D_CAB/A_typename_sensor' with UniqueID 'bd2be362-4922-4e3a-83cc-baa402d43d77'. ---> System.Security.SecurityException: Cannot complete the operation because the user does not have rights to write to ElementTemplate 'OMF.FogLAMP Connector.0001_D_CAB/A_typename_sensor' with UniqueID 'bd2be362-4922-4e3a-83cc-baa402d43d77'. at OSIsoft.AF.PISystem.CheckServerError(dcServerError err, Boolean firstAttempt) at OSIsoft.AF.Support.AFSerialProxy.Call(String rpcName, ProxyDelegate codeBlock) at OSIsoft.AF.Support.AFSerialProxy.CheckInElementTemplate(dcElementTemplate[] changes, dcObjectIdentity[] deletedIdentities, Guid[] elementsWithAutoCreatedAnalyses, Boolean& updatedUOM, Int64[]& securityIds) at OSIsoft.AF.Asset.AFElementTemplate.RemoteCheckInElementTemplate(AFDatabase database, HashSet1 processedItems, dcElementTemplate[] changes, dcObjectIdentity[] deletedIdentities, Guid[] elementsWithAutoCreatedAnalyses) at OSIsoft.AF.AFDatabase.SaveElementTemplates(Boolean doCheckIn, HashSet1 processedItems, List1 appliedList, HashSet1 elementsWithAutoCreatedAnalysesList, List1& changeList, List1& deleteList) at OSIsoft.AF.AFDatabase.ProcessDatabaseApplyCheckInChanges(ProcessMode mode, AFCheckedOutMode checkedOutMode, IEnumerable1 objList, List1 appliedList, HashSet1 elementsWithAutoCreatedObjects, Boolean& dbPagedRefreshRequired) at OSIsoft.AF.AFDatabase.ProcessDatabaseChanges(ProcessMode mode, AFCheckedOutMode checkedOutMode, IEnumerable1 objList, List1 
appliedList, HashSet1 elementsWithAutoCreatedObjects) at OSIsoft.AF.AFDatabase.ProcessDatabaseChanges(ProcessMode mode, AFCheckedOutMode checkedOutMode) at OSIsoft.AF.AFDatabase.CheckIn(AFCheckedOutMode mode) at OSIsoft.Tau.AF.Common.ConnectorDestinationMapper.CommitChanges(AFCheckedOutMode mode) at OSIsoft.Tau.AF.Common.AFContext.CommitChanges(AFCheckedOutMode mode) --- End of inner exception stack trace --- at OSIsoft.Tau.AF.Common.AFContext.CommitChanges(AFCheckedOutMode mode) at OSIsoft.Tau.PI.AssetBuilder.AFAssetBuilder.AddValueLinksToTemplate(AFElementTemplate template, Dictionary2 valueStreamLinks, String connectorType, String connectorUid) at OSIsoft.Tau.PI.AssetBuilder.AFAssetBuilder.AddStreams(AFElement elementToUpdate, Dictionary2 valueStreamLinks, String connectorType, String connectorUid) at OSIsoft.Tau.PI.AssetBuilder.AFAssetBuilder.AddStreams(AssetNode assetNode, Dictionary`2 valueStreamLinks) at OSIsoft.Tau.AssetCache.AssetCache.ProcessLink(AssetLink link) at OSIsoft.Tau.AssetCache.AssetCache.PostLink(TauEvent link, String streamId) at OSIsoft.Tau.AssetCache.AssetCache.FlushAsync(AssetCacheBatch batch)

One more error seen is:
Unable to persist entities to repository: C:\ProgramData\OSIsoft\Tau\Relay.ConnectorHost\SchemaRepository\FogLAMP/schemas.json. Exception: System.InvalidOperationException: Collection was modified; enumeration operation may not execute. at System.ThrowHelper.ThrowInvalidOperationException(ExceptionResource resource) at System.Collections.Generic.Dictionary2.Enumerator.MoveNext() at Newtonsoft.Json.Serialization.JsonSerializerInternalWriter.SerializeDictionary(JsonWriter writer, IDictionary values, JsonDictionaryContract contract, JsonProperty member, JsonContainerContract collectionContract, JsonProperty containerProperty) at Newtonsoft.Json.Serialization.JsonSerializerInternalWriter.Serialize(JsonWriter jsonWriter, Object value, Type objectType) at Newtonsoft.Json.JsonSerializer.SerializeInternal(JsonWriter jsonWriter, Object value, Type objectType) at Newtonsoft.Json.JsonConvert.SerializeObjectInternal(Object value, Type type, JsonSerializer jsonSerializer) at Newtonsoft.Json.JsonConvert.SerializeObject(Object value, JsonConverter[] converters) at OSIsoft.Tau.Relay.Translator.REST.TypeRepository.<>c__DisplayClass26_01.b__0().

Dead link in readthedocs documentation

Hello there

I just wanted to say that there is a dead link on this page, which should somehow link here, but the content is missing. However, I think I found the related content in the GitHub repo under this link, but I might be mistaken.

I'm still evaluating the project, and thanks a lot for this amazing project.

By the way, is the feature to set Applications via the REST API on any north/south plugins already implemented, and if yes, what are the curl commands to set them? An example of how to set a simple Application on the randomwalk or sinusoid south plugin would be highly appreciated.

I can already set it up using this curl command as explained here:

curl -sX POST http://localhost:9081/foglamp/service -d '{"name": "sinusoid", "type": "south", "plugin": "sinusoid", "enabled": true}'

Kind regards

Architecture specific questions.

Hi All,

I am posting it here as I couldn't find any community forum to post clarifications to. Please let me know if we have one.

Going by the architecture document it sounds like a very promising tool. But there are some ambiguities that couldn't be clarified either from the docs or by skimming over the code.

  1. Does a south plugin mandatorily format the sensor reading as a JSON document with the keys asset, timestamp, key and readings, as is done in the DHT11 code?

  2. Can't a south plugin pass the data directly to a north plugin without involving storage?
    I couldn't see any south/north plugin config which can configure this behaviour. Please enlighten me if I am missing anything evident.

  3. As per the architecture document we can plug in additional ML software and any third-party application using the Application module. But there is no description or code explaining how to do it. Is there any API exposed, like we have for the south and north plugins?
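For question 1, the shape those four keys describe might look like the following. This is an illustrative sketch only: the key names come from the question above, while the asset name and values are made up for the example:

```python
import json
from datetime import datetime, timezone

# Hypothetical reading in the asset/timestamp/key/readings shape
# mentioned in the question; all values are invented.
reading = {
    "asset": "dht11",
    "timestamp": datetime(2019, 10, 21, 10, 42, 3,
                          tzinfo=timezone.utc).isoformat(),
    "key": "0001-temp",
    "readings": {"temperature": 22.4, "humidity": 61.0},
}
payload = json.dumps(reading)
```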

http_north not showing in schedule

Hello there

First of all, a very cool project you have. I am trying to evaluate it for an IoT project at school.

My problem so far: I am trying to enable a north plugin (http_north) with the REST API and I am not finding a way of doing it in the documentation or anywhere else.

I know south plugins are far easier to set up, with the following command:

#Adding South plugin randomwalk from scratch
curl -sX POST http://localhost:8081/foglamp/service -d '{"name": "randomwalk", "type": "south", "plugin": "randomwalk", "enabled": true}'

I do have the plugin installed so far


But how do I accomplish setting up the north plugin with the REST API? I don't see a POST task or POST schedule, only PUT to update an existing entry in the schedule or task sections.

Is this feature missing, or am I just on the wrong track?

My best regards


Category output

curl -s http://localhost:8081/foglamp/category | jq
{
  "categories": [
    {
      "key": "SCHEDULER",
      "displayName": "Scheduler",
      "description": "Scheduler configuration"
    },
    {
      "key": "SMNTR",
      "displayName": "Service Monitor",
      "description": "Service Monitor"
    },
    {
      "key": "rest_api",
      "displayName": "Admin API",
      "description": "FogLAMP Admin and User REST API"
    },
    {
      "key": "service",
      "displayName": "FogLAMP Service",
      "description": "FogLAMP Service"
    },
    {
      "key": "General",
      "displayName": "General",
      "description": "General"
    },
    {
      "key": "Advanced",
      "displayName": "Advanced",
      "description": "Advanced"
    },
    {
      "key": "Utilities",
      "displayName": "Utilities",
      "description": "Utilities"
    },
    {
      "key": "randomwalk",
      "displayName": "randomwalk",
      "description": "Generate random walk data points"
    },
    {
      "key": "South",
      "displayName": "South",
      "description": "South microservices"
    },
    {
      "key": "randomwalkAdvanced",
      "displayName": "randomwalkAdvanced",
      "description": "randomwalk advanced config params"
    }
  ]
}

Schedule output

curl -X GET http://localhost:8081/foglamp/schedule | jq
{
  "schedules": [
    {
      "time": 0,
      "exclusive": true,
      "repeat": 3600,
      "id": "cea17db8-6ccc-11e7-907b-a6006ad3dba0",
      "processName": "purge",
      "enabled": true,
      "type": "INTERVAL",
      "day": null,
      "name": "purge"
    },
    {
      "time": 0,
      "exclusive": true,
      "repeat": 15,
      "id": "2176eb68-7303-11e7-8cf7-a6006ad3dba0",
      "processName": "stats collector",
      "enabled": true,
      "type": "INTERVAL",
      "day": null,
      "name": "stats collection"
    },
    {
      "time": 300,
      "exclusive": true,
      "repeat": 43200,
      "id": "2176eb68-7303-11e7-8cf7-a6107ad3db21",
      "processName": "certificate checker",
      "enabled": true,
      "type": "TIMED",
      "day": null,
      "name": "certificate checker"
    },
    {
      "time": 0,
      "exclusive": true,
      "repeat": 0,
      "id": "ea14ad31-4e46-465f-8f8b-98326af74c96",
      "processName": "south_c",
      "enabled": true,
      "type": "STARTUP",
      "day": null,
      "name": "randomwalk"
    },
    {
      "time": 0,
      "exclusive": true,
      "repeat": 3600,
      "id": "d1631422-9ec6-11e7-abc4-cec278b6b50a",
      "processName": "backup",
      "enabled": false,
      "type": "INTERVAL",
      "day": null,
      "name": "backup hourly"
    },
    {
      "time": 0,
      "exclusive": true,
      "repeat": 0,
      "id": "8d4d3ca0-de80-11e7-80c1-9a214cf093ae",
      "processName": "restore",
      "enabled": true,
      "type": "MANUAL",
      "day": null,
      "name": "restore on demand"
    },
    {
      "time": 0,
      "exclusive": true,
      "repeat": 0,
      "id": "fac8dae6-d8d1-11e7-9296-cec278b6b50a",
      "processName": "backup",
      "enabled": true,
      "type": "MANUAL",
      "day": null,
      "name": "backup on demand"
    }
  ]
}
