
Commit 6a2ec03

Merge remote-tracking branch 'dataverse/develop' into tc-citationdate-harvested-dataset

2 parents: 7254c0c + c7b8b82

File tree: 179 files changed, +5925 −796 lines


conf/solr/8.11.1/schema.xml

Lines changed: 6 additions & 0 deletions

```diff
@@ -261,6 +261,9 @@
     <field name="cleaningOperations" type="text_en" multiValued="false" stored="true" indexed="true"/>
     <field name="collectionMode" type="text_en" multiValued="true" stored="true" indexed="true"/>
     <field name="collectorTraining" type="text_en" multiValued="false" stored="true" indexed="true"/>
+    <field name="workflowType" type="text_en" multiValued="true" stored="true" indexed="true"/>
+    <field name="workflowCodeRepository" type="text_en" multiValued="true" stored="true" indexed="true"/>
+    <field name="workflowDocumentation" type="text_en" multiValued="true" stored="true" indexed="true"/>
     <field name="contributor" type="text_en" multiValued="true" stored="true" indexed="true"/>
     <field name="contributorName" type="text_en" multiValued="true" stored="true" indexed="true"/>
     <field name="contributorType" type="text_en" multiValued="true" stored="true" indexed="true"/>
@@ -498,6 +501,9 @@
     <copyField source="cleaningOperations" dest="_text_" maxChars="3000"/>
     <copyField source="collectionMode" dest="_text_" maxChars="3000"/>
     <copyField source="collectorTraining" dest="_text_" maxChars="3000"/>
+    <copyField source="workflowType" dest="_text_" maxChars="3000"/>
+    <copyField source="workflowCodeRepository" dest="_text_" maxChars="3000"/>
+    <copyField source="workflowDocumentation" dest="_text_" maxChars="3000"/>
     <copyField source="contributor" dest="_text_" maxChars="3000"/>
     <copyField source="contributorName" dest="_text_" maxChars="3000"/>
     <copyField source="contributorType" dest="_text_" maxChars="3000"/>
```
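A schema change like the one above only takes effect after the deployed `schema.xml` is replaced and the Solr index is rebuilt. A dry-run sketch of the usual steps, assuming a typical installation layout (the schema path, service name, and server URL are assumptions; the block only prints the commands rather than executing them):

```shell
# Assumed locations and endpoints for a typical installation; adjust as needed.
SOLR_SCHEMA="/usr/local/solr/server/solr/collection1/conf/schema.xml"
DATAVERSE_URL="http://localhost:8080"

# Dry run: print each step instead of running it.
echo "cp conf/solr/8.11.1/schema.xml $SOLR_SCHEMA"
echo "service solr restart"

# Kick off a full reindex through the Dataverse admin API.
REINDEX_CMD="curl $DATAVERSE_URL/api/admin/index"
echo "$REINDEX_CMD"
```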
Lines changed: 72 additions & 0 deletions

# Dataverse Software 5.11.1

This is a bug fix release of the Dataverse Software. The .war file for v5.11 will no longer be made available, and installations should upgrade directly from v5.10.1 to v5.11.1. To do so, you will need **to follow the instructions for installing release 5.11 using the v5.11.1 war file**. (Note specifically upgrade steps 6-9 from the 5.11 release note, most importantly the ones related to the citation block and the Solr schema.) **If you had previously installed v5.11** (no longer available), follow the simplified instructions below.

## Release Highlights

Dataverse Software 5.11 contains two critical issues that are fixed in this release.

First, if you delete a file from a published version of a dataset that has restricted files, the file will be deleted from the file system (or S3) and lose its "owner id" in the database. For details, see Issue #8867.

Second, if you are a superuser, it's possible to click "Delete Draft" and delete a published dataset if it has restricted files. For details, see #8845 and #8742.

## Notes for Dataverse Installation Administrators

### Identifying Datasets with Deleted Files

If you have been running 5.11, check whether any files show "null" for the owner id. The "owner" of a file is its parent dataset:

```
select * from dvobject where dtype = 'DataFile' and owner_id is null;
```

For any of these files, change the owner id to the database id of the parent dataset. In addition, the file on disk (or in S3) is likely gone. Look at the "storageidentifier" field from the query above to determine the location of the file, then restore the file from backup.
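A minimal, self-contained sketch of that find-and-repair sequence, using SQLite to stand in for the production PostgreSQL database (the table is reduced to the relevant columns, and all ids and the storage identifier are hypothetical):

```python
import sqlite3

# Toy stand-in for the production dvobject table, reduced to the relevant
# columns. All ids and the storage identifier are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE dvobject ("
    "id INTEGER PRIMARY KEY, dtype TEXT, owner_id INTEGER, storageidentifier TEXT)"
)
conn.execute("INSERT INTO dvobject VALUES (10, 'Dataset', NULL, NULL)")
conn.execute("INSERT INTO dvobject VALUES (11, 'DataFile', NULL, 'file://example-storage-id')")

# Step 1: the query from the release note -- find files with a null owner id.
orphans = conn.execute(
    "SELECT id, storageidentifier FROM dvobject "
    "WHERE dtype = 'DataFile' AND owner_id IS NULL"
).fetchall()

# Step 2: point each orphaned file back at its parent dataset's database id.
# (Dataset id 10 is hypothetical; in production you must determine the real
# parent dataset for each file yourself, then restore the file from backup.)
for file_id, _storageidentifier in orphans:
    conn.execute("UPDATE dvobject SET owner_id = ? WHERE id = ?", (10, file_id))
conn.commit()

remaining = conn.execute(
    "SELECT count(*) FROM dvobject WHERE dtype = 'DataFile' AND owner_id IS NULL"
).fetchone()[0]
print(remaining)  # 0 once every orphan has been reassigned
```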
### Identifying Datasets Superusers May Have Accidentally Destroyed

Check the "actionlogrecord" table for DestroyDatasetCommand. While these "destroy" entries are normal when a superuser uses the API to destroy datasets, an entry is also created if a superuser has accidentally deleted a published dataset in the web interface with the "Delete Draft" button.
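A query along these lines can surface the relevant entries. This is a sketch: the `actionsubtype` and `starttime` column names are assumptions about the actionlogrecord table, so verify them against your schema before running it:

```
select * from actionlogrecord
 where actionsubtype = 'DestroyDatasetCommand'
 order by starttime desc;
```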
## Complete List of Changes

For the complete list of code changes in this release, see the [5.11.1 Milestone](https://github.com/IQSS/dataverse/milestone/105?closed=1) in GitHub.

For help with upgrading, installing, or general questions, please post to the [Dataverse Community Google Group](https://groups.google.com/forum/#!forum/dataverse-community) or email [email protected].

## Installation

If this is a new installation, please see our [Installation Guide](https://guides.dataverse.org/en/5.11.1/installation/). Please also contact us to get added to the [Dataverse Project Map](https://guides.dataverse.org/en/5.11.1/installation/config.html#putting-your-dataverse-installation-on-the-map-at-dataverse-org) if you have not done so already.

## Upgrade Instructions

0\. These instructions assume that you've already successfully upgraded from Dataverse Software 4.x to Dataverse Software 5 following the instructions in the [Dataverse Software 5 Release Notes](https://github.com/IQSS/dataverse/releases/tag/v5.0). After upgrading from the 4.x series to 5.0, you should progress through the other 5.x releases before attempting the upgrade to 5.11.1. **To upgrade from 5.10.1, follow the instructions for installing release 5.11 using the v5.11.1 war file**. If you had previously installed v5.11 (no longer available), follow the simplified instructions below.

If you are running Payara as a non-root user (and you should be!), **remember not to execute the commands below as root**. Use `sudo` to change to that user first. For example, `sudo -i -u dataverse` if `dataverse` is your dedicated application user.

In the following commands we assume that Payara 5 is installed in `/usr/local/payara5`. If not, adjust as needed.

`export PAYARA=/usr/local/payara5`

(or `setenv PAYARA /usr/local/payara5` if you are using a `csh`-like shell)

1\. Undeploy the previous version.

- `$PAYARA/bin/asadmin list-applications`
- `$PAYARA/bin/asadmin undeploy dataverse<-version>`

2\. Stop Payara and remove the generated directory.

- `service payara stop`
- `rm -rf $PAYARA/glassfish/domains/domain1/generated`

3\. Start Payara.

- `service payara start`

4\. Deploy this version.

- `$PAYARA/bin/asadmin deploy dataverse-5.11.1.war`

5\. Restart Payara.

- `service payara stop`
- `service payara start`
Lines changed: 6 additions & 0 deletions

## Adding new static search facet: Metadata Types

A new static search facet has been added to the search side panel. This new facet is called "Metadata Types" and is driven by metadata blocks: when a metadata field value is inserted into a dataset, an entry for the metadata block it belongs to is added to this new facet.

This new facet needs to be configured before it appears on the search side panel. The configuration assigns, per dataverse collection, which metadata blocks to show, and it is inherited by child dataverses.

To configure the new facet, use the Metadata Block Facet API: <https://guides.dataverse.org/en/latest/api/native-api.html#set-metadata-block-facet-for-a-dataverse-collection>
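As a sketch, the configuration is a POST of a JSON array of block names to the collection's `metadatablockfacets` endpoint, per the linked guide. The collection alias, block name, and API token below are placeholders, and the block only prints the call rather than sending it:

```shell
# Placeholders; adjust for your installation.
SERVER_URL="http://localhost:8080"
ALIAS="myCollection"        # hypothetical collection alias
BLOCKS='["socialscience"]'  # metadata blocks to show in the facet

# Dry run: print the API call (endpoint per the linked guide) instead of sending it.
FACET_CMD="curl -H X-Dataverse-key:\$API_TOKEN -X POST -H Content-type:application/json -d '$BLOCKS' $SERVER_URL/api/dataverses/$ALIAS/metadatablockfacets"
echo "$FACET_CMD"
```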
Lines changed: 6 additions & 0 deletions

## Adding Computational Workflow Metadata

The new Computational Workflow metadata block allows depositors to effectively tag datasets as computational workflows.

To add the new metadata block, follow the metadata customization instructions in the guides: <https://guides.dataverse.org/en/latest/admin/metadatacustomization.html>

The new metadata block tsv file is located at `dataverse/scripts/api/data/metadatablocks/computational_workflow.tsv`.
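The load step in that guide boils down to posting the TSV to the admin API. A dry-run sketch, assuming a local installation (the command is printed, not executed):

```shell
SERVER_URL="http://localhost:8080"   # assumed local installation
TSV="scripts/api/data/metadatablocks/computational_workflow.tsv"

# Dry run: print the documented load command instead of executing it.
LOAD_CMD="curl $SERVER_URL/api/admin/datasetfield/load -X POST --data-binary @$TSV -H 'Content-type: text/tab-separated-values'"
echo "$LOAD_CMD"
```

After loading a block, the guide also calls for updating the Solr schema and reindexing so the new fields become searchable.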
Lines changed: 1 addition & 0 deletions

Terms of Use is now imported when using DDI format through harvesting or the native API. (Issue #8715, PR #8743)
Lines changed: 7 additions & 0 deletions

Under "bug fixes":

Small bugs have been fixed in the dataset export in the JSON and DDI formats: the export of "undefined" as a metadata language has been eliminated in the former, and a duplicate keyword tag in the latter.

Run `ReExportAll` to update the exports, following the directions in the [Admin Guide](http://guides.dataverse.org/en/5.12/admin/metadataexport.html#batch-exports-through-the-api).
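The batch re-export in that guide is a single admin API call. A dry-run sketch (local server URL assumed; the command is printed, not run):

```shell
SERVER_URL="http://localhost:8080"  # assumed local installation
REEXPORT_CMD="curl $SERVER_URL/api/admin/metadata/reExportAll"
echo "$REEXPORT_CMD"
```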
Lines changed: 4 additions & 0 deletions

## New DB Settings

The following DB settings have been added:

- `:ShibAffiliationOrder` - Select the first or last entry in an Affiliation array
- `:ShibAffiliationSeparator` (default: ";") - Set the separator for the Affiliation array
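Database settings like these are set through the admin settings API (`PUT /api/admin/settings/:SettingName`). A dry-run sketch; the `firstAffiliation` value is illustrative, so check the Installation Guide for the accepted values of `:ShibAffiliationOrder`:

```shell
SERVER_URL="http://localhost:8080"  # assumed local installation

# Dry run: print the settings API calls instead of sending them.
ORDER_CMD="curl -X PUT -d firstAffiliation $SERVER_URL/api/admin/settings/:ShibAffiliationOrder"
SEP_CMD="curl -X PUT -d ';' $SERVER_URL/api/admin/settings/:ShibAffiliationSeparator"
echo "$ORDER_CMD"
echo "$SEP_CMD"
```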
Lines changed: 1 addition & 1 deletion

```diff
@@ -1,5 +1,5 @@
 Tool Type Scope Description
 Data Explorer explore file A GUI which lists the variables in a tabular data file allowing searching, charting and cross tabulation analysis. See the README.md file at https://github.com/scholarsportal/dataverse-data-explorer-v2 for the instructions on adding Data Explorer to your Dataverse.
 Whole Tale explore dataset A platform for the creation of reproducible research packages that allows users to launch containerized interactive analysis environments based on popular tools such as Jupyter and RStudio. Using this integration, Dataverse users can launch Jupyter and RStudio environments to analyze published datasets. For more information, see the `Whole Tale User Guide <https://wholetale.readthedocs.io/en/stable/users_guide/integration.html>`_.
-File Previewers explore file A set of tools that display the content of files - including audio, html, `Hypothes.is <https://hypothes.is/>`_ annotations, images, PDF, text, video, tabular data, and spreadsheets - allowing them to be viewed without downloading. The previewers can be run directly from github.io, so the only required step is using the Dataverse API to register the ones you want to use. Documentation, including how to optionally brand the previewers, and an invitation to contribute through github are in the README.md file. Initial development was led by the Qualitative Data Repository and the spreasdheet previewer was added by the Social Sciences and Humanities Open Cloud (SSHOC) project. https://github.com/GlobalDataverseCommunityConsortium/dataverse-previewers
+File Previewers explore file A set of tools that display the content of files - including audio, html, `Hypothes.is <https://hypothes.is/>`_ annotations, images, PDF, text, video, tabular data, spreadsheets, and GeoJSON - allowing them to be viewed without downloading. The previewers can be run directly from github.io, so the only required step is using the Dataverse API to register the ones you want to use. Documentation, including how to optionally brand the previewers, and an invitation to contribute through github are in the README.md file. Initial development was led by the Qualitative Data Repository and the spreasdheet previewer was added by the Social Sciences and Humanities Open Cloud (SSHOC) project. https://github.com/gdcc/dataverse-previewers
 Data Curation Tool configure file A GUI for curating data by adding labels, groups, weights and other details to assist with informed reuse. See the README.md file at https://github.com/scholarsportal/Dataverse-Data-Curation-Tool for the installation instructions.
```
Lines changed: 1 addition & 0 deletions

["authorName", "authorAffiliation"]

doc/sphinx-guides/source/_static/api/ddi_dataset.xml

Lines changed: 1 addition & 0 deletions

```diff
@@ -142,6 +142,7 @@
       </method>
       <dataAccs>
          <notes type="DVN:TOA" level="dv">Terms of Access</notes>
+         <notes type="DVN:TOU" level="dv">Terms of Use</notes>
          <setAvail>
             <accsPlac>Data Access Place</accsPlac>
             <origArch>Original Archive</origArch>
```
