
dataverse-docker's Introduction

"Archive in a box" package for Dataverse

Note: Docker images are also being developed upstream by the Containerization Working Group. You are welcome to join! The goal is for this project to eventually switch to these images.

This infrastructure with integrated services is intended for research organizations and universities that want to run a community-based data repository to make their data FAIR (Findable, Accessible, Interoperable and Reusable). The idea behind "Archive in a box" is simple: it performs an automatic installation and sets up the complete infrastructure without extra effort, even for institutions with limited technical resources. You can easily turn this demonstration service with default FAKE persistent identifiers into a fully operational data archive with production DOIs, a mail relay and automatically connected external storage.

Dataverse Docker module

The Dataverse Docker module was developed by DANS-KNAW (Data Archiving and Networked Services, the Netherlands) to run the Dataverse data repository on Kubernetes and other cloud services that support Docker. The currently available Dataverse version in the Docker module is 6.2. The development of the Docker module was funded by the SSHOC project, which is creating the social sciences and humanities area of the European Open Science Cloud (EOSC).

Presentations

"Archive in a box" was presented at the Harvard Dataverse Community Meeting 2022 by Slava Tykhonov (DANS-KNAW), you can watch it on YouTube.

Quick Demo

You can run Dataverse in demo mode using default settings (FAKE DOIs, no mail relay, GUI in English, Cloud storage support disabled):

bash ./demostart.sh

It takes 2-5 minutes to start the complete infrastructure (depending on your hardware configuration). You will find Dataverse running on http://localhost:8080.
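Since the containers come up at different speeds, a small polling helper can tell you when the web interface is actually reachable. This is a sketch, not part of the shipped scripts; the URL and the 5-minute timeout are assumptions:

```shell
#!/bin/sh
# wait_for: poll a command until it succeeds or a timeout expires.
# Usage: wait_for <timeout-seconds> <command...>
wait_for() {
  timeout=$1; shift
  elapsed=0
  until "$@" >/dev/null 2>&1; do
    if [ "$elapsed" -ge "$timeout" ]; then
      echo "timed out after ${timeout}s" >&2
      return 1
    fi
    sleep 5
    elapsed=$((elapsed + 5))
  done
  echo "service is up"
}

# Example (assumes the demo runs on localhost:8080) -- poll for up to 5 minutes:
# wait_for 300 curl -sf http://localhost:8080
```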

Basic Technologies

This software package relies on container technologies like Docker and Kubernetes, and can install and manage all dependencies without human interaction. "Archive in a box" uses Docker Compose, a tool for defining and running multi-container Docker applications that allows configuring an application's services. All networking concerns such as domain name setup, SSL certificates and routing are handled by Traefik, a leading modern reverse proxy and load balancer.
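To illustrate how Traefik picks up a service, a compose service typically only joins the shared proxy network and declares routing labels. The fragment below is a hypothetical sketch in Traefik v2 label syntax, not a copy of the shipped compose files; the containous/whoami image and the traefik network do appear elsewhere in this README:

```yaml
services:
  whoami:
    image: containous/whoami
    networks:
      - traefik          # shared proxy network (docker network create traefik)
    labels:
      - "traefik.enable=true"
      # Route requests for this hostname to the container:
      - "traefik.http.routers.whoami.rule=Host(`whoami.${traefikhost}`)"
```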

Demo version and development/production service

The demonstration version of Dataverse ("Proof of Concept") is available out of the box after completing the installation on a local computer or virtual machine. It ships with FAKE persistent identifiers, a language switch, various content previewers, and other components integrated in the infrastructure. This default installation can be done by people without a technical background and allows extensive testing of the basic functionality without spending any time on the system administration tasks related to a Dataverse setup.

To run Dataverse as a fully operational production service, data providers should fill in all settings in the configuration file: their domain name, DOI settings, the language of the web interface, mail relay, external controlled vocabularies and storage. It is also possible to integrate custom Docker-based services in the infrastructure and create your own software packages serving the needs of specific data providers, for example, to integrate a separate Shibboleth container for federated authentication, install a new data previewer or activate a data processing pipeline.

Configuration

The configuration is managed in one central place, an environment variables file called .env, so administrators do not need to modify any other files in the software package. It contains all settings required to deploy Dataverse, for example, to set the language of the web interface or to establish connections to the local database, the SOLR search engine, the mail relay or external storage.
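For illustration, a fragment of such a .env file might look like the sketch below. COMPOSE_FILE and MAINLANG are variable names that appear elsewhere in this README; the commented-out mail and DOI names are hypothetical placeholders, so check .env_sample for the real variable names:

```
# Which distro to run (see "Dataverse distributions" below)
COMPOSE_FILE=./docker-compose.yml

# Language of the web interface
MAINLANG=en

# Hypothetical placeholders -- the real names are in .env_sample:
# MAILSERVER=smtp.example.org
# DOI_AUTHORITY=10.5072
```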

Startup process

The startup process of "Archive in a box" is simplified and uses an init.d folder, defined in .env, to arrange the order in which the Dataverse configuration scripts run. It contains bash scripts that run the services sequentially and allows easy customization of Dataverse instances according to the requirements of the data providers. All necessary actions, such as setting up the domain name and mail relay, activating previewers, and installing webhooks, can be found in this init.d folder. After a restart, all available datasets in Dataverse are reindexed automatically.
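The ordering mechanism can be sketched in a few lines of shell. This is an illustration, not the shipped startup code; the numeric-prefix naming convention and the demo script contents are assumptions:

```shell
#!/bin/sh
# Run every script in an init.d-style folder in lexical (numeric-prefix) order.
# INIT_D would normally come from .env; here it is a temporary demo directory.
INIT_D=$(mktemp -d)

# Demo scripts: the 01- prefix guarantees it runs before 02-.
printf 'echo configured-domain\n'    > "$INIT_D/01-domain.sh"
printf 'echo activated-previewers\n' > "$INIT_D/02-previewers.sh"

result=""
for script in "$INIT_D"/*.sh; do
  result="$result $(sh "$script")"   # sequential: each step finishes first
done
echo "$result"
```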

Custom metadata support

Custom metadata schemes can easily be integrated in Dataverse using the same init.d-based mechanism. A new schema should first be declared in the .env file, and then a script should be added to download the schema as a .tsv file and upload it to Dataverse. As a demonstration of this feature, the CESSDA CMM and CLARIN metadata schemes are already integrated in the software package and can be activated in the .env file and in the Dataverse web interface.
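Such an init.d step typically boils down to a download plus an upload to the Dataverse admin API. In the sketch below the URLs and file names are illustrative, and the /api/admin/datasetfield/load endpoint is the one documented in the Dataverse admin guide for loading custom metadata blocks; the script only prints the commands it would run, so it reads as a recipe rather than a finished installer:

```shell
#!/bin/sh
# Sketch of an init.d step that registers a custom metadata block.
# SCHEMA_URL and DATAVERSE_URL are illustrative values, not shipped defaults.
SCHEMA_URL="https://example.org/my-schema.tsv"
DATAVERSE_URL="http://localhost:8080"

# Dry run: print each command instead of executing it.
run() { echo "+ $*"; }

run curl -sfL -o /tmp/my-schema.tsv "$SCHEMA_URL"
run curl -X POST "$DATAVERSE_URL/api/admin/datasetfield/load" \
    -H "Content-type: text/tab-separated-values" \
    --upload-file /tmp/my-schema.tsv
```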

External controlled vocabularies plugin

The functionality to support external controlled vocabularies was contributed by DANS-KNAW in collaboration with the Global Dataverse Community Consortium, and allows connecting Dataverse to vocabularies hosted by Skosmos, ORCID, Wikidata and other service providers. "Archive in a box" ships with a basic demonstration of this feature and encourages developers from all over the world to implement their own interfaces in order to integrate Dataverse with third-party controlled vocabularies.

External storages

Another important feature of "Archive in a box" is external storage support. It integrates MinIO, a high-performance, Kubernetes-native object store that delivers scalable, secure, S3-compatible object storage on every public cloud such as Amazon AWS, Google Cloud Platform or Microsoft Azure. This means Dataverse can store data in cloud storage instead of local file storage, and different storages can be used for the containers (sub-dataverses) of different data providers created within the same Dataverse instance.
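Connecting Dataverse to an S3-compatible store is done through dataverse.files.* JVM options; the option names below follow the Dataverse installation guide (and the S3 settings quoted later in this document), while the "minio" store id, endpoint, and bucket are illustrative values, not the shipped defaults:

```
# Illustrative asadmin commands for an S3/MinIO store:
./asadmin create-jvm-options "-Ddataverse.files.storage-driver-id=minio"
./asadmin create-jvm-options "-Ddataverse.files.minio.type=s3"
./asadmin create-jvm-options "-Ddataverse.files.minio.label=minio"
./asadmin create-jvm-options "-Ddataverse.files.minio.custom-endpoint-url=http://minio:9000"
./asadmin create-jvm-options "-Ddataverse.files.minio.bucket-name=dataverse"
./asadmin create-jvm-options "-Ddataverse.files.minio.path-style-access=true"
```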

Webhooks for third parties services

There is a separate webhook implementation for the integration of external services based on Dataverse-related actions like dataset modification or publication. For example, an automatic FAIR assessment can be done by sending a newly created persistent identifier to a third-party service when the user publishes a new dataset. It is also possible to integrate Dataverse with various pipelines and workflows dedicated to specific tasks, such as named entity recognition in uploaded files. This can be useful for building GDPR-related workflows that automatically check whether personal names are present in the data.
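As an illustration of the idea, publishing a dataset could trigger a POST like the one sketched below. The endpoint URL and JSON payload shape are hypothetical, not the shipped webhook format; only the demo DOI also appears in this README:

```shell
#!/bin/sh
# Hypothetical webhook call: send a newly published PID to a FAIR-assessment
# service. The endpoint and JSON shape are illustrative only.
FAIR_SERVICE="https://fair-checker.example.org/assess"

webhook_payload() {
  # $1 = persistent identifier of the published dataset
  printf '{"pid": "%s", "event": "dataset.published"}' "$1"
}

# Dry run: show the request instead of sending it.
echo "POST $FAIR_SERVICE"
webhook_payload "doi:10.5072/FK2/J8FOEQ"
```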

The list of main features

  • fully automatic Dataverse deployment with Traefik proxy
  • Dataverse configuration managed through the environment file .env
  • different Dataverse distributions with services of your preference, suitable for different use cases
  • external controlled vocabularies support (demo of CESSDA CMM metadata fields connected to the Skosmos framework)
  • MinIO storage support
  • data previewers integrated in the distribution
  • startup process managed through scripts located in the init.d folder
  • automatic SOLR reindex
  • external services integration via PostgreSQL triggers
  • support for custom metadata schemes (CESSDA CMM, CLARIN CMDI, ...)
  • built-in web interface localization using the Dataverse language pack to support multiple languages out of the box

Project Requirements

  • Docker & Docker-compose

Project Setup

  1. Clone the project:
git clone https://github.com/IQSS/dataverse-docker
  2. Copy the environment file by running the following command in the project root directory:
cp .env_sample .env

You can edit the .env file and add your configuration for the DOI service, mail relay, S3 connections, etc.

Dataverse distributions

You can use different Dataverse distributions, or distros, and add any Dockerized components depending on your use case. To switch to another distro, change the COMPOSE_FILE variable in your .env file to point at the corresponding yaml file. For example, edit the .env file, change this variable

COMPOSE_FILE=./docker-compose.yml

and replace it with the specification of another Dataverse distro with SSL support:

COMPOSE_FILE=./distros/docker-compose-ssl.yml
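The switch is a one-line change to .env. The sketch below shows one way to script it with sed (GNU sed assumed); it operates on a temporary copy of the file so it is safe to run anywhere:

```shell
#!/bin/sh
# Point COMPOSE_FILE at the SSL distro. Demonstrated on a temporary copy
# of .env so this sketch does not touch a real installation.
ENV_FILE=$(mktemp)
echo 'COMPOSE_FILE=./docker-compose.yml' > "$ENV_FILE"

# Rewrite the variable in place; the next docker-compose up picks it up.
sed -i 's|^COMPOSE_FILE=.*|COMPOSE_FILE=./distros/docker-compose-ssl.yml|' "$ENV_FILE"
cat "$ENV_FILE"
```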

Installation

The Dataverse Docker module uses Træfik, a modern HTTP reverse proxy and load balancer that makes deploying microservices easy. Træfik integrates with your existing infrastructure components (Docker, Swarm mode, Kubernetes, Marathon, Consul, Etcd, Rancher, Amazon ECS, ...) and configures itself automatically and dynamically.

You need to set the value of "traefikhost" to your domain name (for example, sshopencloud.eu, or just localhost) before you start to deploy the Dataverse infrastructure:

export traefikhost=localhost OR export traefikhost=sshopencloud.eu

and create a docker network for all the containers you want to expose on the web:

docker network create traefik

By default you'll get an SSL certificate provided by Let's Encrypt; please specify your email address if you need https support, for example:

export [email protected]

  • Make sure you have docker and docker-compose installed
  • Run docker-compose up to start Dataverse.

The standalone Dataverse should be running on dataverse-dev.localhost, or dataverse-dev.sshopencloud.eu if you selected that domain.

The default user/password is dataverseAdmin/admin; you should change it after the first login.

Check whether Dataverse is already available: curl http://localhost:8080

If it's not coming up, please check whether all required containers are up: docker ps


CONTAINER ID        IMAGE                                 COMMAND                  CREATED              STATUS              PORTS                                          NAMES
fa727beadf8f   coronawhy/dataverse:5.10                    "/tini -- /bin/sh -c…"   About an hour ago   Up About an hour   0.0.0.0:4848->4848/tcp, :::4848->4848/tcp, 8181/tcp, 0.0.0.0:8009->8009/tcp, :::8009->8009/tcp, 9009/tcp, 0.0.0.0:8088->8080/tcp, :::8088->8080/tcp   dataverse
d4b83af11948   coronawhy/solr:8.9.0                       "docker-entrypoint.s…"   About an hour ago   Up About an hour   0.0.0.0:8983->8983/tcp, :::8983->8983/tcp                                                                                                             solr
bf0478c288cd   containous/whoami                          "/whoami"                About an hour ago   Up About an hour   80/tcp                                                                                                                                                whoami
38d7151cb7cb   postgres:10.13                             "docker-entrypoint.s…"   About an hour ago   Up About an hour   0.0.0.0:5433->5432/tcp, :::5433->5432/tcp                                                                                                             postgres
ce83792a3abd   minio/minio:RELEASE.2021-12-10T23-03-39Z   "/usr/bin/docker-ent…"   About an hour ago   Up About an hour   9000/tcp, 0.0.0.0:9016-9017->9016-9017/tcp, :::9016-9017->9016-9017/tcp                                                                               minio
92c8fa3730a2   traefik:v2.2                               "/entrypoint.sh --ap…"   About an hour ago   Up About an hour   0.0.0.0:80->80/tcp, :::80->80/tcp, 0.0.0.0:443->443/tcp, :::443->443/tcp                                                                              traefik

Enjoy Dataverse

Open the selected domain name in your browser (like sshopencloud.eu) or just go to http://localhost:8080.

Going from Docker Compose to Kubernetes

If you want to run Dataverse on Kubernetes please use this module

Dataverse web interface localization

The localization of Dataverse was done in CESSDA DataverseEU and other projects. It is maintained by the Global Dataverse Community Consortium and is available for multiple languages.

Citation

For academic use please cite this work as: Vyacheslav Tykhonov, Marion Wittenberg, Eko Indarto, Wilko Steinhoff, Laura Huis in 't Veld, Stefan Kasberger, Philipp Conzett, Cesare Concordia, Peter Kiraly, & Tomasz Parkoła. (2022). D5.5 'Archive in a Box' repository software and proof of concept of centralised installation in the cloud. Zenodo. https://doi.org/10.5281/zenodo.6676391

Warning

If not all languages come up at the same time, please increase the RAM available to Docker (at least 10 GB for 5 languages).

To Do

  • Documentation on the external controlled vocabularies support
  • Dataverse distributions for different projects
  • Information about the process to set up SSHOC Kubernetes

dataverse-docker's People

Contributors

4tikhonov, antonpolishko, carlsonp, dependabot[bot], ekoi, kaitlinnewson, pameyer, pdurbin, pkiraly, pmauduit, skasberger, wilkos-dans, xibriz


dataverse-docker's Issues

Dataverse 4.9.x & Solr 7

With Dataverse 4.9, the Solr version changed from 4.x to the latest 7.x version.

Is there any plan to get these versions into the docker setup?

Branding Dataverse

Hi there,

First, thanks for the work dockerizing Dataverse, it helps a lot!

Is there a non-invasive way to customize the Dataverse theme using dataverse-docker?

What I mean by non-invasive is to avoid changing docker-compose files or Dockerfiles. Is it possible to deploy as-is, then customize it from "outside", without going into the internals of dataverse-docker and modifying configs or scripts?

Thanks in advance!

getting error when I'm trying to publish dataset.

I get this message when I try to publish a dataset:
"This dataset may not be published due to an error when contacting the DataCite Service. Please try again. If you believe this is an error, please contact..."
I saw this issue, so I tried this:
./asadmin create-jvm-options "-Ddataverse.siteUrl=http\://my server IP\:8085" after exec-ing into the dataverse container. The command succeeded but I still get the same error (I also tried restarting the containers). I also tried
asadmin create-jvm-options "-Ddataverse.siteUrl=http\://localhost\:8085" with no change.

please help! Thank you!

PGPASSWORD env variable for dataverse container doesn't apply

The variable "PGPASSWORD" in docker-compose.yml is not used inside the dataverse container. Instead, the password is hardcoded inside the container ekoindarto/dataverse-cvm:5.0:

/opt/payara/scripts/init_2_conf_payara.sh
...
echo 'set resources.jdbc-connection-pool.dvnDbPool.property.password=dvnsecret' >> ${DV_POSTBOOT}
...

Something like the following is expected instead:

echo "set resources.jdbc-connection-pool.dvnDbPool.property.password=${PGPASSWORD}" >> ${DV_POSTBOOT}

Dataverse port for previewers

After adding previewers, they try to look for port 8080.
We can't use port 8080 currently, because another application uses that port.

In docker-compose, there is an option to set the port to "8081:8080", and Dataverse itself works fine with it.
We tried to change the environment setting "socket_port" to 8081, but that doesn't change the port of the previewers.


Docker fails to start if there is a running Apache httpd server

When I run docker on a machine which already has a running Apache or other web server instance, the process fails with an error message like this:

proxy_1 (5b1a33808c9e3a3cece64884ff0fca5aab67db94423f27100077efd7390b8cf1): Error starting userland proxy: listen tcp 0.0.0.0:80: bind: address already in use

It would be great if the README would mention conflicting services, such as Apache, PostgreSQL or a Java server.

postgresql - necessity to create the db by the dataverse container ?

The dataverse container tries to create the database by connecting to the postgresql server (from the postgresql container) on first run.

dataverse_1  | Checking if we can talk to Postgres as the admin user...
postgres_1   | FATAL:  password authentication failed for user "postgres"
postgres_1   | DETAIL:  Password does not match for user "postgres".
postgres_1   | 	Connection matched pg_hba.conf line 95: "host all all all md5"
dataverse_1  | (Tried executing: PGPASSWORD=secret; export PGPASSWORD; /usr/bin/psql -h postgres -p 5432 -U postgres -d postgres -c 'SELECT * FROM pg_roles' > /dev/null 2>&1) 

But is this necessary, as the env variables on the postgres container should already ensure that the database exists?

By the way, digging into the entrypoint of the dataverse container, I could not find where the perl script (responsible for the failing psql command above) is supposed to be called:

https://github.com/IQSS/dataverse-docker/blob/master/dataversedock/testdata/scripts/installer/install

Maybe we should consider that the db already exists and not try to create it from the dataverse container.

Publishing dataset

On a fresh installation using the default DOI settings, this error appears when trying to publish:

Error – This dataset may not be published because the EZID Service is currently inaccessible. Please try again. Does the issue continue to persist? If you believe this is an error, please contact Root Support for assistance.

A demo dataset DOI:

doi:10.5072/FK2/J8FOEQ

Identical docker-compose files

Hello,

There is no difference between distributives/docker-compose-ssl.yml and distributives/docker-compose.yml.

The docker-compose.yml file is supposed to be HTTP only.

Shibboleth and Dataverse

Hello guys,

I want to install Shibboleth in my Dataverse, but I don't know where to start using docker. Any suggestions?
My first question is whether Shibboleth must be in the dataverse container, or whether it should be in another container.

Thanks in advance

postgresql - including the init.sql in the docker image ?

Looking at the docker-compose file, we can see that every volume mounted into the containers is stored in the data/ subdirectory, except for postgresql, which also mounts the init.sql script as a volume:
https://github.com/IQSS/dataverse-docker/blob/master/docker-compose.yml#L11

Since we have full control over how the postgresql image is built, why not add the file (via an ADD/COPY statement) directly in the Dockerfile? This would keep only the "data"/"runtime" aspects as volumes in the docker-compose file and improve readability. It might be just a detail, though.

API calls which should be callable from the local machine are not accessible from the host

Once docker is running, the API calls which should be callable from the local machine are not accessible from the host:

$ curl -i -H X-Dataverse-key:$API_TOKEN http://localhost/api/admin/datafiles/integrity/fixmissingoriginalsizes
HTTP/1.1 301 Moved Permanently
Location: https://localhost/api/admin/datafiles/integrity/fixmissingoriginalsizes
Date: Thu, 24 Mar 2022 13:10:37 GMT
Content-Length: 17
Content-Type: text/plain; charset=utf-8
$ curl -i -k -H X-Dataverse-key:$API_TOKEN https://localhost/api/admin/datafiles/integrity/fixmissingoriginalsizes
HTTP/2 404
content-type: text/plain; charset=utf-8
x-content-type-options: nosniff
content-length: 19
date: Thu, 24 Mar 2022 13:10:59 GMT

404 page not found

Update README.md

It seems like the README has not been updated for the Dataverse 5 release. There are three things I think need improvement:

  1. At least in Docker Compose v1.25.4, the following command doesn't work:
    `docker-compose up -f docker-compose-local.yml`
    The -f parameter must come before the up command:
    `docker-compose -f docker-compose-local.yml up`

  2. The README says it runs 3 containers: solr, dataverse and db. For me it runs 5 containers: dataverse, solr, postgres, dataverse-docker_reverse-proxy_1, and whoami.

  3. The README doesn't mention the URL(s) of the service.

Dataverse Docker installation error: postgres authentication

Dear all,

I am trying to set up a local docker dataverse installation. I got a fatal error during the installation; the error is as follows:

postgres         | 2021-09-27 07:34:56.549 UTC [86] FATAL:  password authentication failed for user "dvnuser"
postgres         | 2021-09-27 07:34:56.549 UTC [86] DETAIL:  Role "dvnuser" does not exist.
postgres         |     Connection matched pg_hba.conf line 95: "host all all all md5"

The steps I took were:

  1. I downloaded https://github.com/IQSS/dataverse-docker
  2. I installed docker, and docker composed
  3. export traefikhost=localhost
  4. docker network create traefik
  5. export useremail=[email protected]
  6. docker-compose -f docker-compose-local.yml up

I tried it on Fedora 34 and an AlmaLinux VM. I asked this in this Google group and got feedback that it was reproduced. It seems like Dataverse is running; I can connect to it and upload files. But the containers that are up do not match what is given in the installation guide.

CONTAINER ID   IMAGE                        COMMAND                  CREATED          STATUS          PORTS                                                                                                                 NAMES
e029c88248d8   coronawhy/dataverse:5.5      "/tini -- /bin/sh -c…"   10 minutes ago   Up 10 minutes   8181/tcp, 0.0.0.0:4848->4848/tcp, :::4848->4848/tcp, 9009/tcp, 0.0.0.0:8085->8080/tcp, :::8085->8080/tcp              dataverse
7725c61098a2   traefik:v2.2                 "/entrypoint.sh --ap…"   10 minutes ago   Up 10 minutes   0.0.0.0:80->80/tcp, :::80->80/tcp, 0.0.0.0:443->443/tcp, :::443->443/tcp, 0.0.0.0:8089->8080/tcp, :::8089->8080/tcp   dataverse-docker-master_reverse-proxy_1
3d227d76e023   containous/whoami            "/whoami"                10 minutes ago   Up 10 minutes   80/tcp                                                                                                                whoami
02ce0dea85d3   ekoindarto/solr-cvm:latest   "/tini -- /tmp/entry…"   10 minutes ago   Up 10 minutes   0.0.0.0:8983->8983/tcp, :::8983->8983/tcp                                                                             solr
a13911c76f7f   postgres:10.13               "docker-entrypoint.s…"   10 minutes ago   Up 10 minutes   0.0.0.0:5433->5432/tcp, :::5433->5432/tcp                                                                             postgres

Thanks in advance.
Best regards

Apache-Glassfish-Docker

Hi, guys, I need some help...
This is the scenario: I have a dataverse project in Docker, with Shibboleth. There are 4 containers:

  • dataverse, containing Glassfish;
  • Apache, containing Shibboleth;
  • Postgresql, and;
  • Solr.

The problem is that I cannot access dataverse via https://DOMAIN_NAME:443 (or :80), i.e. via Apache. I can only access it via :8080, i.e. Glassfish.

I already modified ssl.conf, httpd.conf....

Does anyone have any idea what I should do?

Thanks in advance.

Alexandre

Internal Server Error - (dataverse column "uri" does not exist)

After adding a new version of dataverse.war to dataverse-docker/dataversedock/dv/deps and re-running the image, I get this error:

Internal Server Error - An unexpected error was encountered, no more information is available.

Screenshot from 2019-05-11 07-33-09

From the glassfish logs, the problem is with a missing database column

Local Exception Stack: 
Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.5.2.v20140319-9ad6abd): org.eclipse.persistence.exceptions.DatabaseException
Internal Exception: org.postgresql.util.PSQLException: ERROR: column "requestor_id" does not exist
  Position: 65
Error Code: 0
Call: SELECT ID, EMAILED, OBJECTID, READNOTIFICATION, SENDDATE, TYPE, REQUESTOR_ID, USER_ID FROM USERNOTIFICATION WHERE (USER_ID = ?) ORDER BY SENDDATE DESC
	bind => [1 parameter bound]
Query: ReadAllQuery(referenceClass=UserNotification sql="SELECT ID, EMAILED, OBJECTID, READNOTIFICATION, SENDDATE, TYPE, REQUESTOR_ID, USER_ID FROM USERNOTIFICATION WHERE (USER_ID = ?) ORDER BY SENDDATE DESC")

I tried the solution stated in the Troubleshooting page of the Dataverse guide, but it seems that the dbUser doesn't have permission to perform this solution in the docker instance.

ERROR: must be owner of relation datasetfieldtype

It is not clear when it is "ready" to use

When I run it I receive a lot of messages in the log, but it is not clear when the service is ready to be used. There is no log message saying "Happy using" or "You can access dataverse at http://....".

Polish translation of Dataverse

Yesterday I noticed a Polish translation of Dataverse at CeON/dataverse#29 by @madryk . Thank you!

@MrK191 and I chatted briefly about it at http://irclog.iq.harvard.edu/dataverse/2019-02-01 and he said it was fine to go ahead and create this issue. As of 5942a0d I don't see a Polish translation at https://github.com/IQSS/dataverse-docker/tree/5942a0de5785600e3de08d450ce2259e7ff7f7b3/dataversedock/dataverse-property-files so I think it's new.

I assume at some point we will close this issue and track translations somewhere else (perhaps under https://github.com/GlobalDataverseCommunityConsortium ?) but I'm not sure what the plan is. Please see also:

Dockerfile of docker-image (dataverse container)

Hello guys,

I've deployed the project in docker. I did a lot of customizations in my Dataverse's Dockerfile.
However, now I need to upgrade, since my Dataverse version uses Glassfish 4, which has a vulnerability identified by the security team of my department.
Using this project I can solve the vulnerability problem, however I am struggling to redo all the customizations from my Dockerfile. I searched for the Dockerfile (and other files) used to build the image coronawhy/dataverse:5.1.1-cv but had no success. Is there a way to get them (the Dockerfile and other files used)?

best regards,

Alexandre

Needing help with Solr

Good morning guys,

when I try to bring up the containers I get the following error message:

Creating solr ... error

ERROR: for solr Cannot start service solr: b'OCI runtime create failed: container_linux.go:348: starting container process caused "exec: \"/entrypoint.sh\": permission denied": unknown'

How can I solve it?
Thanks in advance!

automatic logout

The system forwards the user to a logged-out page after clicking on some buttons. For example:
login > add data throws the user to the login page.
Or, after logging in and entering some datasets, I need to log in again.
Does anyone know why?
Thanks!

I also opened an issue here
no answer so far.

Cannot connect to S3-Compatible object storage

Hi all !

I'm unable to connect to my Swift object storage (S3 compatible). I'm getting this error:

com.amazonaws.AmazonClientException: Cannot instantiate a S3 client; check your AWS credentials and region

I'm confused because I can call the S3 Swift API with awscli, so my config and credentials files are good.

This is my configuration:

        dataverse.files.s3.profile=default
        dataverse.files.storage-driver-id=s3
        dataverse.files.s3.type=s3
        dataverse.files.s3.bucket-name=datachallenge2020
        dataverse.files.s3.custom-endpoint-region=gra
        dataverse.files.s3.path-style-access=true
        dataverse.files.s3.custom-endpoint-url=https://s3.gra.cloud.ovh.net
        dataverse.files.s3.label=s3
        dataverse.files.s3.payload-signing=true

config file:

[default]
region = gra

credentials file:

[default]
aws_access_key_id = *******************************
aws_secret_access_key = *****************************

I put these configuration files in the home directory of the user "payara". I think this user runs the glassfish server.

Any suggestions?

Maeva

How to change language versions of the interface?

In the last section of the README there is an emphasis on multilinguality, but there is no description of how to change the language of the interface. Is it available out of the box (which is what the README suggests)? If not, please describe the steps to add language files.

How to add three languages in dataverse

In the .env file the MAINLANG option accepts only one language.
Is there any way to pass more than one language in the MAINLANG variable?

example: br, es, fr

MAINLANG=br

Best Regards,

Marcos Alexandre

Update requirements

@4tikhonov @pdurbin
as mentioned in : IQSS/dataverse#4665

On a fresh Ubuntu 16.04 installation, the current script at

dataverse-docker/dataversedock/step1.sh

line 16 :

unzip glassfish-4.1.zip

requires unzip to be installed first:

sudo apt-get install unzip

otherwise initial.bash will fail silently (it doesn't add the glassfish dependency and docker compose will fail on step 7).

I also noticed that the use of wget fails on macOS, since it's not installed in a default macOS installation.

Installing TwoRavens in IQSS/dataverse-docker

Dear all,
I am very glad that I could successfully install Dataverse using the "DataverseEU Docker module". It is running smoothly on localhost:8085/dataverse/root. I am using Ubuntu 18.04.

Now I am trying to install TwoRavens following the instructions at
http://guides.dataverse.org/en/4.16/installation/r-rapache-tworavens.html

When I run the TwoRavens install script by using

sudo ./install.pl
and enter these values, I get an error (please see below):

Directory where TwoRavens is installed: [/var/www/html/dataexplore]

Apache config directory: [/etc/httpd] [/etc/apache2]

Apache Web Root directory: [/var/www/html]

TwoRavens/rApache URL: [http://sharif-VirtualBox:80] [http://localhost:80]
Sorry, this is not a valid URL!
Please enter a valid TwoRavens/rApache URL.
(or ctrl-C to exit the installer)

Could you please help? Thank you very much.

Kind regards,
Sharif

Curl command yields status:"error", message:"Endpoint available from localhost only"

Hello,
I tried to execute this curl command

sudo curl -X POST -H 'Content-type: application/json' http://localhost:8085/api/admin/externalTools -d '{ "displayName":"View Image", "description":"Preview an image file.", "scope":"file", "type":"explore", "toolUrl":"https://qualitativedatarepository.github.io/dataverse-previewers/previewers/ImagePreview.html", "toolParameters": { "queryParameters":[ {"fileid":"{fileId}"}, {"siteUrl":"{siteUrl}"}, {"key":"{apiToken}"}, {"datasetid":"{datasetId}"}, {"datasetversion":"{datasetVersion}"} ] }, "contentType":"image/png" }'

but I get this message:
{ status:"error", message:"Endpoint available from localhost only. Please contact the dataverse administrator"}

I have installed Dataverse using "DataverseEU Docker module". It is running on a linux (Ubuntu)
at http://localhost:8085/dataverse/root

Could you please suggest how to fix it?

Thank you very much.

Regards,
Sharif

Any way to know token for default user?

Hello, first of all thanks for this repository!
I'm contacting you as I'm trying to use dataverse-docker for QA integration tests of a SWORD API client, configured in Travis CI. The docker image works well, but to trigger the integration tests I need to know and set a valid token. Is there any way to get a token for the default user/password, or maybe a default token for testing?
Thanks
Emmanuel

customizing navbar logo error because web docroot is not found

The instructions on how to customize navbar doesn't work on dataverse-docker:

I don't known why but dataverse-docker is using non-default path for docroot, while the official document (link above) says that the docroot is placed at /usr/local/payara5/glassfish/domains/domain1/docroot the dataverse-docker is not using it, it seems that dataverse-docker docroot is at /opt/payara/appserver/glassfish/domains/production/applications/dataverse.

The real problem is: how can I copy (or map) files from my host machine into the dataverse container's docroot at /opt/payara/appserver/glassfish/domains/production/applications/dataverse, so that I can customize the navbar logo with the command suggested in the official documentation on customizing branding/navbar/layout?

Command below copied as-is from dataverse tutorial just for clarification:

curl -X PUT -d '/logos/navbar/logo.png' http://localhost:8080/api/admin/settings/:LogoCustomizationFile

What I tried, without success, was mapping a volume in docker-compose (using multiple Compose files) so that my theme's logo file would end up inside the dataverse container's docroot folder.

    volumes:
      - ./my-theme:/opt/payara/appserver/glassfish/domains/production/applications/my-theme

But it crashes Dataverse completely and the service doesn't come up.

I also tried mapping to the path documented in the Dataverse docs, /usr/local/payara5/glassfish/domains/domain1/docroot, but if I understood correctly it is not used by the web server at all.
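A possible workaround, sketched under the assumption that the container's Payara domain is named production (as the paths in this report suggest) and has the standard per-domain docroot directory: mount the theme files into a subdirectory of that docroot rather than over the deployed application directory, so the bind mount cannot shadow the deployment.

```yaml
services:
  dataverse:
    volumes:
      # Mount only a logos/ subdirectory; overlaying the applications/
      # directory itself is what crashes the deployment.
      - ./my-theme/logos:/opt/payara/appserver/glassfish/domains/production/docroot/logos
```

With the files in place, the `:LogoCustomizationFile` setting can then be pointed at `/logos/navbar/logo.png` as in the official command quoted above.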

Get several errors in dockerized version of Dataverse...

First one --

"Unable to find a single dataverse using alias "root": javax.persistence.NoResultException: getSingleResult() did not retrieve any entities.|#]"

This is using the "docker-compose-local.yml" file. I followed the instructions exactly as indicated, just to see how the application looks.

Also got this error message; second one --

"Unable to connect"

Any thoughts?

Error messages during startup

When I run the service (from a clean state) I receive a number of error messages, such as:

dataverse        | [#|2022-03-24T12:42:35.649+0000|SEVERE|Payara 5.2021.1|javax.enterprise.system.core|_ThreadID=110;_ThreadName=admin-thread-pool::admin-listener(1);_TimeMillis=1648125755649;_LevelValue=1000;_MessageID=NCLS-CORE-00003;|
dataverse        |   Exception while running a command
dataverse        | java.lang.IllegalArgumentException: Invalid property syntax, missing property value: 8080
dataverse        |     at org.glassfish.common.util.admin.MapInjectionResolver.convertStringToProperties(MapInjectionResolver.java:507)
                    ...

or

dataverse        | [#|2022-03-24T13:44:57.491+0000|SEVERE|Payara 5.2021.1||_ThreadID=198;_ThreadName=GF Domain XML;_TimeMillis=1648129497491;_LevelValue=1000;|
dataverse        |   Error in dealiasing the password ${ALIAS=rserve_password_alias}: Alias  rserve_password_alias does not exist|#]

or

postgres         | 2022-03-24 13:45:09.383 UTC [32] STATEMENT:  CREATE TABLE EXTERNALTOOLTYPE (ID  SERIAL NOT NULL, TYPE VARCHAR(255) NOT NULL, EXTERNALTOOL_ID BIGINT NOT NULL, PRIMARY KEY (ID))
dataverse        | [#|2022-03-24T13:45:09.387+0000|WARNING|Payara 5.2021.1|javax.org.glassfish.persistence.org.glassfish.persistence.common|_ThreadID=1;_ThreadName=main;_TimeMillis=1648129509387;_LevelValue=900;|
dataverse        |   PER01000: Got SQLException executing statement "CREATE TABLE EXTERNALTOOLTYPE (ID  SERIAL NOT NULL, TYPE VARCHAR(255) NOT NULL, EXTERNALTOOL_ID BIGINT NOT NULL, PRIMARY KEY (ID))": org.postgresql.util.PSQLException: ERROR: relation "externaltooltype" already exists|#]
dataverse        |
postgres         | 2022-03-24 13:45:09.390 UTC [32] ERROR:  relation "index_externaltooltype_externaltool_id" already exists

I am not sure which issues come from Docker and which are part of Dataverse. It would be worth comparing this log with the log of a clean official installation.

Language files outdated

Dear Dataverse Community, I am writing (even duplicating) here since the GlobalDataverseCommunity seems to have paused development of the language packs.

We just installed Dataverse at Bonn University (Germany) as a test repository. Unfortunately, the German translations (and probably many other languages) are outdated, since the last commits seem to be far in the past.

We helped ourselves using the existing file and Google translations (which work surprisingly well). Therefore, I have some questions for this community:

  1. Is the repository still maintained, or are there other translation activities out there?
  2. We are ready to share our file (keeping in mind that some translations may still be odd due to the automatic procedure). In that case, would you like me to update the German version of the file, or should I send it to you?
  3. Are you interested in our strategy for semi-automatically translating the files? It takes around 30 minutes per language, and the accuracy can be further improved later. Here is the Google sheet I used to translate; it is probably self-explanatory, but I can also provide support in understanding the procedure: https://docs.google.com/spreadsheets/d/15BW2MH5nJz_RAtiAmWBp3xt4RVNf9z5YF2mAEWvMKgQ/edit#gid=832324258

PS: In my opinion, sooner or later we will not be able to maintain such projects manually (more than 2500 keys have to be considered in dozens of languages) and should accept automatic procedures anyway if we want to keep this up to date. The Google algorithms are getting amazingly good. We could even think about an automatic procedure using their API that just sends deltas to users for final corrections. Tell me if you are interested in creating something like that.

dataverse glassfish not persistent

The dataverse container, which runs Glassfish, does not have a volume attached.
I tried many times to map a volume to the Glassfish folder, but the problem is that
Glassfish will then not autostart.
Now when the container restarts I lose many configurations, such as:
-Logo
-SignUp button disabled
-StorageOptions - custom created
-Remote API calls permission

Has anyone been able to set up a volume to persist the Glassfish configs?

I cannot publish any dataverse or dataset

Hi Guys,

I successfully installed Dataverse. But when I log in or try to add a dataset/dataverse, the page gives me the following message:

Not Authorized - You are not authorized to view this page. If you believe this is an error, please contact Root Support for assistance.

What should I do to solve this?

Best regards

Using asadmin e.g. to get server variables requires a username

In administering the service you should be able to use Payara's asadmin command-line tool, but by default it fails:

$ docker container exec -ti dataverse /bin/bash
# /opt/payara/appserver/bin/asadmin list-jvm-options | grep dataverse
Authentication failed for user: null 

(Usually, this means invalid user name and/or password)

If one tries it from the host, asadmin asks for credentials:

$ docker container exec -ti dataverse /opt/payara/appserver/bin/asadmin list-jvm-options
Enter admin user name:

Solution: add the admin user name ('admin') and password ('admin') to the README.

But the pipeline still doesn't work. To avoid being asked again, do the following steps:

  1. Enter the container's command-line environment:
$ docker container exec -ti dataverse /bin/bash
  2. Enter the credentials ('admin' and 'admin'):
# /opt/payara/appserver/bin/asadmin login
Enter admin user name [Enter to accept default]>
Enter admin password>
# exit
  3. Test that it works:
$ docker container exec -ti dataverse /opt/payara/appserver/bin/asadmin list-jvm-options | grep dataverse
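As an alternative to the interactive login, asadmin also accepts `--user` and `--passwordfile`, which works in non-interactive pipelines. A sketch, using the 'admin'/'admin' credentials noted above (the password file must be reachable inside the container, e.g. copied in with `docker cp`):

```shell
# Create a password file in the format asadmin expects.
cat > /tmp/asadmin-pwd.txt <<'EOF'
AS_ADMIN_PASSWORD=admin
EOF
grep AS_ADMIN_PASSWORD /tmp/asadmin-pwd.txt

# Non-interactive invocation (run where the dataverse container is available):
# docker cp /tmp/asadmin-pwd.txt dataverse:/tmp/asadmin-pwd.txt
# docker container exec dataverse /opt/payara/appserver/bin/asadmin \
#   --user admin --passwordfile /tmp/asadmin-pwd.txt list-jvm-options
```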

docker-compose dataverse 4.15

Hello,
I would like to know if there are any plans to support the latest version, 4.15, in docker-compose.

I did try it on my end and it is working, with some errors.

This was our experience here at www.iit.it:
1) Replaced all references from postgres 9.3 to 9.6. Right after changing the postgresql version from 9.3 to 9.6 we had this error:
"The data directory was initialized by PostgreSQL version 9.3, which is not compatible with this version 9.6"
It seems we were still missing the correct DB init data and scripts.

2) Fixed the version downloaded in init.bash, and the same for the .war file:

#Force download of version 4.15
wget "https://github.com/IQSS/dataverse/releases/download/v4.15/dvinstall.zip" -O ./postgresql/dvinstall.zip

#step2 force download here
wget https://github.com/IQSS/dataverse/releases/download/v4.15/dataverse-4.15.war -O ./dataversedock/dv/deps/dataverse.war

Inside postgresql/testdata we had to get the scripts. We got jhove.conf, jhoveConfig.xsd and schema.xml from
https://github.com/IQSS/dataverse/releases/download/v4.15/dvinstall.zip
and testdata/scripts from
git clone --branch v4.15 https://github.com/IQSS/dataverse.git
(inside dataverse-4.15/scripts).
I think we need help with the correct database files...
When starting the application we still saw the error:

db | ERROR: relation "sequence" does not exist at character 22
db | STATEMENT: SELECT SEQ_NAME FROM SEQUENCE WHERE SEQ_NAME <> SEQ_NAME

"There are current some issues with the menu names showing quotes we are currently testing and will report any more errors."

404 Page not found when starting docker

I was trying to start up Dataverse following the instructions from the readme page. Everything went fine, but when accessing the endpoint dataverse-dev.localhost, it shows the "page not found" below.
Is there a startup step that I missed?


Steps taken:

  1. Ran the following export commands:
export traefikhost=localhost
export [email protected]
  2. Created the network by running:
    sudo docker network create traefik
  3. Started up docker compose for local:
    docker-compose -f docker-compose-local.yml up
  4. Tried to access in the browser: dataverse-dev.localhost & localhost

It seems like pgAdmin is working fine, as I was able to access its endpoint at localhost:8086; only dataverse shows the error.


docker ps output (screenshot omitted)

Some docker logs when running docker compose

dataverse        | [#|2021-08-20T06:02:13.056+0000|SEVERE|Payara 5.2020.3|org.eclipse.persistence.session./file:/opt/payara/appserver/glassfish/domains/production/applications/ejb-timer-service-app/WEB-INF/classes/___EJB__Timer__App.ejb|_ThreadID=20;_ThreadName=RunLevelControllerThread-1629439272500;_TimeMillis=1629439333056;_LevelValue=1000;|
dataverse        |   Local Exception Stack: 
dataverse        | Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.7.6.payara-p1): org.eclipse.persistence.exceptions.DatabaseException
dataverse        | Internal Exception: java.sql.SQLException: Error in allocating a connection. Cause: Connection could not be allocated because: The connection attempt failed.
dataverse        | Error Code: 0
dataverse        | 	at org.eclipse.persistence.exceptions.DatabaseException.sqlException(DatabaseException.java:318)
dataverse        | 	at org.eclipse.persistence.sessions.JNDIConnector.connect(JNDIConnector.java:150)
dataverse        | 	at org.eclipse.persistence.sessions.DatasourceLogin.connectToDatasource(DatasourceLogin.java:172)
dataverse        | 	at org.eclipse.persistence.internal.sessions.DatabaseSessionImpl.setOrDetectDatasource(DatabaseSessionImpl.java:225)
dataverse        | 	at org.eclipse.persistence.internal.sessions.DatabaseSessionImpl.loginAndDetectDatasource(DatabaseSessionImpl.java:809)
dataverse        | 	at org.eclipse.persistence.internal.jpa.EntityManagerFactoryProvider.login(EntityManagerFactoryProvider.java:256)
dataverse        | 	at org.eclipse.persistence.internal.jpa.EntityManagerSetupImpl.deploy(EntityManagerSetupImpl.java:772)
dataverse        | 	at org.eclipse.persistence.internal.jpa.EntityManagerFactoryDelegate.getAbstractSession(EntityManagerFactoryDelegate.java:222)
dataverse        | 	at org.eclipse.persistence.internal.jpa.EntityManagerFactoryDelegate.createEntityManagerImpl(EntityManagerFactoryDelegate.java:330)
dataverse        | 	at org.eclipse.persistence.internal.jpa.EntityManagerFactoryImpl.createEntityManagerImpl(EntityManagerFactoryImpl.java:350)
dataverse        | 	at org.eclipse.persistence.internal.jpa.EntityManagerFactoryImpl.createEntityManager(EntityManagerFactoryImpl.java:331)
dataverse        | 	at com.sun.enterprise.container.common.impl.EntityManagerWrapper._getDelegate(EntityManagerWrapper.java:197)
dataverse        | 	at com.sun.enterprise.container.common.impl.EntityManagerWrapper.createNamedQuery(EntityManagerWrapper.java:517)
dataverse        | 	at org.glassfish.ejb.persistent.timer.TimerBean.findTimersByOwnerAndState(TimerBean.java:209)
dataverse        | 	at org.glassfish.ejb.persistent.timer.TimerBean.findActiveTimersOwnedByThisServer(TimerBean.java:530)
dataverse        | 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
dataverse        | 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
dataverse        | 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
dataverse        | 	at java.lang.reflect.Method.invoke(Method.java:498)
dataverse        | 	at org.glassfish.ejb.security.application.EJBSecurityManager.runMethod(EJBSecurityManager.java:588)
dataverse        | 	at org.glassfish.ejb.security.application.EJBSecurityManager.invoke(EJBSecurityManager.java:408)
dataverse        | 	at com.sun.ejb.containers.BaseContainer.invokeBeanMethod(BaseContainer.java:4796)
dataverse        | 	at com.sun.ejb.EjbInvocation.invokeBeanMethod(EjbInvocation.java:665)
dataverse        | 	at com.sun.ejb.containers.interceptors.AroundInvokeChainImpl.invokeNext(InterceptorManager.java:834)
dataverse        | 	at com.sun.ejb.EjbInvocation.proceed(EjbInvocation.java:615)
dataverse        | 	at com.sun.ejb.containers.interceptors.SystemInterceptorProxy.doCall(SystemInterceptorProxy.java:163)
dataverse        | 	at com.sun.ejb.containers.interceptors.SystemInterceptorProxy.aroundInvoke(SystemInterceptorProxy.java:140)
dataverse        | 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
dataverse        | 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
dataverse        | 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
dataverse        | 	at java.lang.reflect.Method.invoke(Method.java:498)
dataverse        | 	at com.sun.ejb.containers.interceptors.AroundInvokeInterceptor.intercept(InterceptorManager.java:888)
dataverse        | 	at com.sun.ejb.containers.interceptors.AroundInvokeChainImpl.invokeNext(InterceptorManager.java:833)
dataverse        | 	at com.sun.ejb.containers.interceptors.InterceptorManager.intercept(InterceptorManager.java:375)
dataverse        | 	at com.sun.ejb.containers.BaseContainer.__intercept(BaseContainer.java:4768)
dataverse        | 	at com.sun.ejb.containers.BaseContainer.intercept(BaseContainer.java:4756)
dataverse        | 	at com.sun.ejb.containers.EJBLocalObjectInvocationHandler.invoke(EJBLocalObjectInvocationHandler.java:212)
dataverse        | 	at com.sun.ejb.containers.EJBLocalObjectInvocationHandlerDelegate.invoke(EJBLocalObjectInvocationHandlerDelegate.java:90)
dataverse        | 	at com.sun.proxy.$Proxy487.findActiveTimersOwnedByThisServer(Unknown Source)
dataverse        | 	at org.glassfish.ejb.persistent.timer.PersistentEJBTimerService.restoreEJBTimers(PersistentEJBTimerService.java:370)
dataverse        | 	at org.glassfish.ejb.persistent.timer.PersistentEJBTimerService.resetEJBTimers(PersistentEJBTimerService.java:1401)
dataverse        | 	at com.sun.ejb.containers.EJBTimerService.initPersistentTimerService(EJBTimerService.java:436)
dataverse        | 	at com.sun.ejb.containers.EJBTimerService.getEJBTimerService(EJBTimerService.java:257)
dataverse        | 	at com.sun.ejb.containers.BaseContainer.initialize(BaseContainer.java:857)
dataverse        | 	at java.util.ArrayList.forEach(ArrayList.java:1257)
dataverse        | 	at org.glassfish.ejb.startup.EjbApplication.initialize(EjbApplication.java:250)
dataverse        | 	at org.glassfish.internal.data.EngineRef.initialize(EngineRef.java:189)
dataverse        | 	at java.lang.Iterable.forEach(Iterable.java:75)
dataverse        | 	at org.glassfish.internal.data.ApplicationInfo.lambda$initialize$0(ApplicationInfo.java:395)
dataverse        | 	at java.util.ArrayList.forEach(ArrayList.java:1257)
dataverse        | 	at org.glassfish.internal.data.ApplicationInfo.initialize(ApplicationInfo.java:395)
dataverse        | 	at com.sun.enterprise.v3.server.ApplicationLifecycle.initialize(ApplicationLifecycle.java:618)
dataverse        | 	at com.sun.enterprise.v3.server.ApplicationLoaderService.postConstruct(ApplicationLoaderService.java:332)
dataverse        | 	at org.jvnet.hk2.internal.ClazzCreator.postConstructMe(ClazzCreator.java:303)
dataverse        | 	at org.jvnet.hk2.internal.ClazzCreator.create(ClazzCreator.java:351)
dataverse        | 	at org.jvnet.hk2.internal.SystemDescriptor.create(SystemDescriptor.java:463)
dataverse        | 	at org.glassfish.hk2.runlevel.internal.AsyncRunLevelContext.findOrCreate(AsyncRunLevelContext.java:281)
dataverse        | 	at org.glassfish.hk2.runlevel.RunLevelContext.findOrCreate(RunLevelContext.java:65)
dataverse        | 	at org.jvnet.hk2.internal.Utilities.createService(Utilities.java:2102)
dataverse        | 	at org.jvnet.hk2.internal.ServiceHandleImpl.getService(ServiceHandleImpl.java:93)
dataverse        | 	at org.jvnet.hk2.internal.ServiceHandleImpl.getService(ServiceHandleImpl.java:67)
dataverse        | 	at org.glassfish.hk2.runlevel.internal.CurrentTaskFuture$QueueRunner.oneJob(CurrentTaskFuture.java:1213)
dataverse        | 	at org.glassfish.hk2.runlevel.internal.CurrentTaskFuture$QueueRunner.run(CurrentTaskFuture.java:1144)
dataverse        | 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
dataverse        | 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
dataverse        | 	at java.lang.Thread.run(Thread.java:748)
dataverse        | Caused by: java.sql.SQLException: Error in allocating a connection. Cause: Connection could not be allocated because: The connection attempt failed.
dataverse        | 	at com.sun.gjc.spi.base.AbstractDataSource.getConnection(AbstractDataSource.java:119)
dataverse        | 	at org.eclipse.persistence.sessions.JNDIConnector.connect(JNDIConnector.java:138)
dataverse        | 	... 64 more
dataverse        | Caused by: javax.resource.spi.ResourceAllocationException: Error in allocating a connection. Cause: Connection could not be allocated because: The connection attempt failed.
dataverse        | 	at com.sun.enterprise.connectors.ConnectionManagerImpl.internalGetConnection(ConnectionManagerImpl.java:319)
dataverse        | 	at com.sun.enterprise.connectors.ConnectionManagerImpl.allocateConnection(ConnectionManagerImpl.java:196)
dataverse        | 	at com.sun.enterprise.connectors.ConnectionManagerImpl.allocateConnection(ConnectionManagerImpl.java:171)
dataverse        | 	at com.sun.enterprise.connectors.ConnectionManagerImpl.allocateConnection(ConnectionManagerImpl.java:166)
dataverse        | 	at com.sun.gjc.spi.base.AbstractDataSource.getConnection(AbstractDataSource.java:113)
dataverse        | 	... 65 more
dataverse        | Caused by: com.sun.appserv.connectors.internal.api.PoolingException: Connection could not be allocated because: The connection attempt failed.
dataverse        | 	at com.sun.enterprise.resource.pool.datastructure.RWLockDataStructure.addResource(RWLockDataStructure.java:103)
dataverse        | 	at com.sun.enterprise.resource.pool.ConnectionPool.addResource(ConnectionPool.java:287)
dataverse        | 	at com.sun.enterprise.resource.pool.ConnectionPool.createResourceAndAddToPool(ConnectionPool.java:1532)
dataverse        | 	at com.sun.enterprise.resource.pool.ConnectionPool.createResources(ConnectionPool.java:957)
dataverse        | 	at com.sun.enterprise.resource.pool.ConnectionPool.initPool(ConnectionPool.java:235)
dataverse        | 	at com.sun.enterprise.resource.pool.ConnectionPool.internalGetResource(ConnectionPool.java:528)
dataverse        | 	at com.sun.enterprise.resource.pool.ConnectionPool.getResource(ConnectionPool.java:386)
dataverse        | 	at com.sun.enterprise.resource.pool.PoolManagerImpl.getResourceFromPool(PoolManagerImpl.java:244)
dataverse        | 	at com.sun.enterprise.resource.pool.PoolManagerImpl.getResource(PoolManagerImpl.java:171)
dataverse        | 	at com.sun.enterprise.connectors.ConnectionManagerImpl.getResource(ConnectionManagerImpl.java:360)
dataverse        | 	at com.sun.enterprise.connectors.ConnectionManagerImpl.internalGetConnection(ConnectionManagerImpl.java:307)
dataverse        | 	... 69 more
dataverse        | Caused by: com.sun.appserv.connectors.internal.api.PoolingException: Connection could not be allocated because: The connection attempt failed.
dataverse        | 	at com.sun.enterprise.resource.pool.ConnectionPool.createSingleResource(ConnectionPool.java:937)
dataverse        | 	at com.sun.enterprise.resource.pool.ConnectionPool.createResource(ConnectionPool.java:1209)
dataverse        | 	at com.sun.enterprise.resource.pool.datastructure.RWLockDataStructure.addResource(RWLockDataStructure.java:98)
dataverse        | 	... 79 more
dataverse        | Caused by: com.sun.appserv.connectors.internal.api.PoolingException: Connection could not be allocated because: The connection attempt failed.
dataverse        | 	at com.sun.enterprise.resource.allocator.LocalTxConnectorAllocator.createResource(LocalTxConnectorAllocator.java:110)
dataverse        | 	at com.sun.enterprise.resource.pool.ConnectionPool.createSingleResource(ConnectionPool.java:920)
dataverse        | 	... 81 more
dataverse        | Caused by: javax.resource.spi.ResourceAllocationException: Connection could not be allocated because: The connection attempt failed.
dataverse        | 	at com.sun.gjc.spi.DSManagedConnectionFactory.createManagedConnection(DSManagedConnectionFactory.java:131)
dataverse        | 	at com.sun.enterprise.resource.allocator.LocalTxConnectorAllocator.createResource(LocalTxConnectorAllocator.java:87)
dataverse        | 	... 82 more
dataverse        | Caused by: org.postgresql.util.PSQLException: The connection attempt failed.
dataverse        | 	at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:297)
dataverse        | 	at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:49)
dataverse        | 	at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:211)
dataverse        | 	at org.postgresql.Driver.makeConnection(Driver.java:459)
dataverse        | 	at org.postgresql.Driver.connect(Driver.java:261)
dataverse        | 	at java.sql.DriverManager.getConnection(DriverManager.java:664)
dataverse        | 	at java.sql.DriverManager.getConnection(DriverManager.java:247)
dataverse        | 	at org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:98)
dataverse        | 	at org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:83)
dataverse        | 	at org.postgresql.ds.PGConnectionPoolDataSource.getPooledConnection(PGConnectionPoolDataSource.java:58)
dataverse        | 	at org.postgresql.ds.PGPoolingDataSource.getPooledConnection(PGPoolingDataSource.java:381)
dataverse        | 	at org.postgresql.ds.PGPoolingDataSource.getConnection(PGPoolingDataSource.java:326)
dataverse        | 	at com.sun.gjc.spi.DSManagedConnectionFactory.createManagedConnection(DSManagedConnectionFactory.java:117)
dataverse        | 	... 83 more
dataverse        | Caused by: java.net.UnknownHostException: postgres
dataverse        | 	at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:184)
dataverse        | 	at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
dataverse        | 	at java.net.Socket.connect(Socket.java:607)
dataverse        | 	at org.postgresql.core.PGStream.<init>(PGStream.java:81)
dataverse        | 	at org.postgresql.core.v3.ConnectionFactoryImpl.tryConnect(ConnectionFactoryImpl.java:93)
dataverse        | 	at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:197)
dataverse        | 	... 95 more
dataverse        | |#]
dataverse        | 
dataverse        | [#|2021-08-20T06:02:13.069+0000|WARNING|Payara 5.2020.3|javax.enterprise.ejb.container|_ThreadID=20;_ThreadName=RunLevelControllerThread-1629439272500;_TimeMillis=1629439333069;_LevelValue=900;_MessageID=AS-EJB-00056;|
dataverse        |   A system exception occurred during an invocation on EJB TimerBean, method: public java.util.Set org.glassfish.ejb.persistent.timer.TimerBean.findActiveTimersOwnedByThisServer()|#]
dataverse        | 
dataverse        | [#|2021-08-20T06:02:13.070+0000|WARNING|Payara 5.2020.3|javax.enterprise.ejb.container|_ThreadID=20;_ThreadName=RunLevelControllerThread-1629439272500;_TimeMillis=1629439333070;_LevelValue=900;|
dataverse        |   javax.ejb.EJBException: Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.7.6.payara-p1): org.eclipse.persistence.exceptions.DatabaseException
dataverse        | Internal Exception: java.sql.SQLException: Error in allocating a connection. Cause: Connection could not be allocated because: The connection attempt failed.
dataverse        | Error Code: 0
dataverse        | 	at com.sun.ejb.containers.EJBContainerTransactionManager.processSystemException(EJBContainerTransactionManager.java:723)
dataverse        | 	at com.sun.ejb.containers.EJBContainerTransactionManager.completeNewTx(EJBContainerTransactionManager.java:652)
dataverse        | 	at com.sun.ejb.containers.EJBContainerTransactionManager.postInvokeTx(EJBContainerTransactionManager.java:482)
dataverse        | 	at com.sun.ejb.containers.BaseContainer.postInvokeTx(BaseContainer.java:4562)
dataverse        | 	at com.sun.ejb.containers.BaseContainer.postInvoke(BaseContainer.java:2111)
dataverse        | 	at com.sun.ejb.containers.BaseContainer.postInvoke(BaseContainer.java:2081)
dataverse        | 	at com.sun.ejb.containers.EJBLocalObjectInvocationHandler.invoke(EJBLocalObjectInvocationHandler.java:220)
dataverse        | 	at com.sun.ejb.containers.EJBLocalObjectInvocationHandlerDelegate.invoke(EJBLocalObjectInvocationHandlerDelegate.java:90)
dataverse        | 	at com.sun.proxy.$Proxy487.findActiveTimersOwnedByThisServer(Unknown Source)
dataverse        | 	at org.glassfish.ejb.persistent.timer.PersistentEJBTimerService.restoreEJBTimers(PersistentEJBTimerService.java:370)
dataverse        | 	at org.glassfish.ejb.persistent.timer.PersistentEJBTimerService.resetEJBTimers(PersistentEJBTimerService.java:1401)
dataverse        | 	at com.sun.ejb.containers.EJBTimerService.initPersistentTimerService(EJBTimerService.java:436)
dataverse        | 	at com.sun.ejb.containers.EJBTimerService.getEJBTimerService(EJBTimerService.java:257)
dataverse        | 	at com.sun.ejb.containers.BaseContainer.initialize(BaseContainer.java:857)
dataverse        | 	at java.util.ArrayList.forEach(ArrayList.java:1257)
dataverse        | 	at org.glassfish.ejb.startup.EjbApplication.initialize(EjbApplication.java:250)
dataverse        | 	at org.glassfish.internal.data.EngineRef.initialize(EngineRef.java:189)
dataverse        | 	at java.lang.Iterable.forEach(Iterable.java:75)
dataverse        | 	at org.glassfish.internal.data.ApplicationInfo.lambda$initialize$0(ApplicationInfo.java:395)
dataverse        | 	at java.util.ArrayList.forEach(ArrayList.java:1257)
dataverse        | 	at org.glassfish.internal.data.ApplicationInfo.initialize(ApplicationInfo.java:395)
dataverse        | 	at com.sun.enterprise.v3.server.ApplicationLifecycle.initialize(ApplicationLifecycle.java:618)
dataverse        | 	at com.sun.enterprise.v3.server.ApplicationLoaderService.postConstruct(ApplicationLoaderService.java:332)
dataverse        | 	at org.jvnet.hk2.internal.ClazzCreator.postConstructMe(ClazzCreator.java:303)
dataverse        | 	at org.jvnet.hk2.internal.ClazzCreator.create(ClazzCreator.java:351)
dataverse        | 	at org.jvnet.hk2.internal.SystemDescriptor.create(SystemDescriptor.java:463)
dataverse        | 	at org.glassfish.hk2.runlevel.internal.AsyncRunLevelContext.findOrCreate(AsyncRunLevelContext.java:281)
dataverse        | 	at org.glassfish.hk2.runlevel.RunLevelContext.findOrCreate(RunLevelContext.java:65)
dataverse        | 	at org.jvnet.hk2.internal.Utilities.createService(Utilities.java:2102)
dataverse        | 	at org.jvnet.hk2.internal.ServiceHandleImpl.getService(ServiceHandleImpl.java:93)
dataverse        | 	at org.jvnet.hk2.internal.ServiceHandleImpl.getService(ServiceHandleImpl.java:67)
dataverse        | 	at org.glassfish.hk2.runlevel.internal.CurrentTaskFuture$QueueRunner.oneJob(CurrentTaskFuture.java:1213)
dataverse        | 	at org.glassfish.hk2.runlevel.internal.CurrentTaskFuture$QueueRunner.run(CurrentTaskFuture.java:1144)
dataverse        | 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
dataverse        | 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
dataverse        | 	at java.lang.Thread.run(Thread.java:748)
dataverse        | Caused by: javax.persistence.PersistenceException: Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.7.6.payara-p1): org.eclipse.persistence.exceptions.DatabaseException
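The most useful line in a trace like this is the deepest "Caused by". Here it is java.net.UnknownHostException: postgres: the dataverse container cannot resolve the postgres service name, which points at a Docker networking problem (the containers not sharing a network) rather than a Dataverse bug. A small sketch of pulling the root cause out of a saved log:

```shell
# Simulate a saved Payara log containing the nested causes from the trace above.
printf '%s\n' \
  'Caused by: org.postgresql.util.PSQLException: The connection attempt failed.' \
  'Caused by: java.net.UnknownHostException: postgres' > payara.log

# The last "Caused by" is the root cause.
grep 'Caused by' payara.log | tail -n 1

# To verify the network side (service/network names are assumptions):
# docker network inspect traefik
# docker container exec dataverse getent hosts postgres
```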

Application is not working after following instructions.

Hi!
So after following the instructions it seems the app is not working properly. After the dataverse container boots, the db container log says:

FATAL: role "postgres" does not exist

Which, after digging, seems to be related to docker-library/postgres#493.
So I decided to try to debug the db container first; I changed POSTGRES_USER to postgres instead of dvnapp.

After accessing the container's bash it seems that the role was present and I was able to access postgres.
Unfortunately I have not been able to make it work with docker-compose yet; even though the image is the same, it seems the role is not being created like it would be with a single-container boot.
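For what it's worth, the linked docker-library/postgres#493 is mostly about clients (often healthchecks) connecting as a role that was never created. The official postgres image only creates the role named in POSTGRES_USER when it initializes an empty data directory for the first time. A sketch of the relevant compose settings (names are assumptions matching the report):

```yaml
services:
  postgres:
    environment:
      # Only applied when the data volume is initialized for the first time;
      # changing these later does not create new roles in an existing volume.
      - POSTGRES_USER=dvnapp
      - POSTGRES_PASSWORD=secret
      - POSTGRES_DB=dvndb
```

If the role is missing from an already-initialized volume, either recreate the volume or add the role manually with createuser/psql inside the container.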

Issue with SMTP relay

Hi,
I've set up dataverse using docker-compose.yml and updated the SMTP relay for outlook.com using these commands in the dataverse container:

./asadmin delete-javamail-resource mail/notifyMailSession
./asadmin create-javamail-resource --mailhost smtp-mail.outlook.com --mailuser [email protected] --fromaddress [email protected] --property mail.smtp.auth=true:mail.smtp.password=mypassword:mail.smtp.port=587:mail.smtp.socketFactory.port=587:mail.smtp.socketFactory.fallback=false:mail.smtp.socketFactory.class=javax.net.ssl.SSLSocketFactory mail/notifyMailSession

However, when I click "forgot password" for another test user, I can see "attempted to send mail to [email protected]" in the docker logs, but no email arrives in the test user's mailbox, including the spam folder.

Can anyone help me out with this?

Thank you in advance
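One thing worth checking: outlook.com's relay on port 587 uses STARTTLS, not implicit SSL, so mail.smtp.socketFactory.class=javax.net.ssl.SSLSocketFactory (which is meant for implicit TLS, typically port 465) may prevent the handshake. A sketch of the adjusted session properties (credentials and addresses are placeholders):

```shell
# STARTTLS properties for port 587 instead of an SSL socket factory.
PROPS="mail.smtp.auth=true:mail.smtp.starttls.enable=true:mail.smtp.port=587"
echo "$PROPS"

# Recreate the mail session with the adjusted properties (run inside the
# dataverse container; replace user, address and password with real values):
# ./asadmin delete-javamail-resource mail/notifyMailSession
# ./asadmin create-javamail-resource --mailhost smtp-mail.outlook.com \
#   --mailuser you@example.com --fromaddress you@example.com \
#   --property "${PROPS}":mail.smtp.password=yourpassword mail/notifyMailSession
```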

docker-compose-local.yml (release 5.2 )

When using the supplied docker-compose-local.yml from release 5.2, an error message indicated that dataverse.war is invalid. The error didn't appear when I instead used the "updated to 5.2" file from master.

Where to find images on Docker Hub

Hi! Over at nds-org/ndslabs-dataverse#8 (comment) @craig-willis seems to be suggesting that he's willing to have images pushed to https://hub.docker.com/r/ndslabs/ at least in the short term. Are the images already being pushed to some other Docker Hub organization? (I see that https://hub.docker.com/r/vtycloud has some Dataverse images.) We at @IQSS would really appreciate it if the community could push the images somewhere because we don't have the resources at this time to push them. Please let me know what you think. Thanks!
