Merge pull request #10026 from bencomp/7138-identifer
Replace "identifer" with "identifier" in docs and code
landreev committed Nov 1, 2023
2 parents e47e016 + d76e494 commit 87c0f64
Showing 15 changed files with 91 additions and 46 deletions.
8 changes: 4 additions & 4 deletions doc/sphinx-guides/source/developers/s3-direct-upload-api.rst
Original file line number Diff line number Diff line change
@@ -116,7 +116,7 @@ The allowed checksum algorithms are defined by the edu.harvard.iq.dataverse.Data
curl -X POST -H "X-Dataverse-key: $API_TOKEN" "$SERVER_URL/api/datasets/:persistentId/add?persistentId=$PERSISTENT_IDENTIFIER" -F "jsonData=$JSON_DATA"
Note that this API call can be used independently of the others, e.g. supporting use cases in which the file already exists in S3/has been uploaded via some out-of-band method. Enabling out-of-band uploads is described at :ref:`file-storage` in the Configuration Guide.
With current S3 stores the object identifier must be in the correct bucket for the store, include the PID authority/identifier of the parent dataset, and be guaranteed unique, and the supplied storage identifer must be prefaced with the store identifier used in the Dataverse installation, as with the internally generated examples above.
With current S3 stores the object identifier must be in the correct bucket for the store, include the PID authority/identifier of the parent dataset, and be guaranteed unique, and the supplied storage identifier must be prefaced with the store identifier used in the Dataverse installation, as with the internally generated examples above.
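The storage-identifier rule above can be sketched in code. This is an illustration only, assuming the `<storeId>://<bucket>:<objectName>` layout used by the guide's internally generated examples; the bucket name, object name, and JSON field values below are made up, not taken from this PR:

```java
// Hypothetical sketch: composing a store-prefixed storage identifier and
// the jsonData payload for the add-file call above. The s3://<bucket>:<name>
// layout and the field names mirror the guide's examples; all values are
// invented for illustration.
public class StorageIdentifierSketch {

    // Store identifier prefix first, as the paragraph above requires.
    static String storageIdentifier(String storeId, String bucket, String objectName) {
        return storeId + "://" + bucket + ":" + objectName;
    }

    static String jsonData(String storageIdentifier, String fileName, String md5Hash) {
        return "{\"storageIdentifier\":\"" + storageIdentifier
                + "\",\"fileName\":\"" + fileName
                + "\",\"md5Hash\":\"" + md5Hash + "\"}";
    }

    public static void main(String[] args) {
        String sid = storageIdentifier("s3", "demo-bucket", "18b39722140-50eb7d3c5ece");
        System.out.println(jsonData(sid, "file1.txt", "0cc175b9c0f1b6a831c399e269772661"));
    }
}
```

The uniqueness requirement is not checked here; in a real client the object name would come from something collision-resistant, and Dataverse validates the prefix server-side.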

To add multiple Uploaded Files to the Dataset
---------------------------------------------
@@ -147,7 +147,7 @@ The allowed checksum algorithms are defined by the edu.harvard.iq.dataverse.Data
curl -X POST -H "X-Dataverse-key: $API_TOKEN" "$SERVER_URL/api/datasets/:persistentId/addFiles?persistentId=$PERSISTENT_IDENTIFIER" -F "jsonData=$JSON_DATA"
Note that this API call can be used independently of the others, e.g. supporting use cases in which the files already exist in S3/have been uploaded via some out-of-band method. Enabling out-of-band uploads is described at :ref:`file-storage` in the Configuration Guide.
With current S3 stores the object identifier must be in the correct bucket for the store, include the PID authority/identifier of the parent dataset, and be guaranteed unique, and the supplied storage identifer must be prefaced with the store identifier used in the Dataverse installation, as with the internally generated examples above.
With current S3 stores the object identifier must be in the correct bucket for the store, include the PID authority/identifier of the parent dataset, and be guaranteed unique, and the supplied storage identifier must be prefaced with the store identifier used in the Dataverse installation, as with the internally generated examples above.


Replacing an existing file in the Dataset
@@ -177,7 +177,7 @@ Note that the API call does not validate that the file matches the hash value su
curl -X POST -H "X-Dataverse-key: $API_TOKEN" "$SERVER_URL/api/files/$FILE_IDENTIFIER/replace" -F "jsonData=$JSON_DATA"
Note that this API call can be used independently of the others, e.g. supporting use cases in which the file already exists in S3/has been uploaded via some out-of-band method. Enabling out-of-band uploads is described at :ref:`file-storage` in the Configuration Guide.
With current S3 stores the object identifier must be in the correct bucket for the store, include the PID authority/identifier of the parent dataset, and be guaranteed unique, and the supplied storage identifer must be prefaced with the store identifier used in the Dataverse installation, as with the internally generated examples above.
With current S3 stores the object identifier must be in the correct bucket for the store, include the PID authority/identifier of the parent dataset, and be guaranteed unique, and the supplied storage identifier must be prefaced with the store identifier used in the Dataverse installation, as with the internally generated examples above.

Replacing multiple existing files in the Dataset
------------------------------------------------
@@ -275,4 +275,4 @@ The JSON object returned as a response from this API call includes a "data" that
Note that this API call can be used independently of the others, e.g. supporting use cases in which the files already exist in S3/have been uploaded via some out-of-band method. Enabling out-of-band uploads is described at :ref:`file-storage` in the Configuration Guide.
With current S3 stores the object identifier must be in the correct bucket for the store, include the PID authority/identifier of the parent dataset, and be guaranteed unique, and the supplied storage identifer must be prefaced with the store identifier used in the Dataverse installation, as with the internally generated examples above.
With current S3 stores the object identifier must be in the correct bucket for the store, include the PID authority/identifier of the parent dataset, and be guaranteed unique, and the supplied storage identifier must be prefaced with the store identifier used in the Dataverse installation, as with the internally generated examples above.
2 changes: 1 addition & 1 deletion doc/sphinx-guides/source/user/dataset-management.rst
@@ -783,7 +783,7 @@ The "Compute" button on dataset and file pages will allow you to compute on a si
Cloud Storage Access
--------------------

If you need to access a dataset in a more flexible way than the Compute button provides, then you can use the Cloud Storage Access box on the dataset page to copy the dataset's container name. This unique identifer can then be used to allow direct access to the dataset.
If you need to access a dataset in a more flexible way than the Compute button provides, then you can use the Cloud Storage Access box on the dataset page to copy the dataset's container name. This unique identifier can then be used to allow direct access to the dataset.

.. _deaccession:

2 changes: 1 addition & 1 deletion doc/sphinx-guides/source/user/find-use-data.rst
@@ -71,7 +71,7 @@ View Files

Files in a Dataverse installation each have their own landing page that can be reached through the search results or through the Files table on their parent dataset's page. The dataset page and file page offer much the same functionality in terms of viewing and editing files, with a few small exceptions.

- In installations that have enabled support for persistent identifers (PIDs) at the file level, the file page includes the file's DOI or handle, which can be found in the file citation and also under the Metadata tab.
- In installations that have enabled support for persistent identifiers (PIDs) at the file level, the file page includes the file's DOI or handle, which can be found in the file citation and also under the Metadata tab.
- Previewers for several common file types are available and can be added by installation administrators.
- The file page's Versions tab gives you a version history that is more focused on the individual file rather than the dataset as a whole.

@@ -495,10 +495,24 @@ private DatasetVersion getDatasetVersionByQuery(String queryString){
}
} // end getDatasetVersionByQuery




public DatasetVersion retrieveDatasetVersionByIdentiferClause(String identifierClause, String version){
/**
* @deprecated because of a typo; use {@link #retrieveDatasetVersionByIdentifierClause(String, String) retrieveDatasetVersionByIdentifierClause} instead
* @see #retrieveDatasetVersionByIdentifierClause(String, String)
* @param identifierClause
* @param version
* @return a DatasetVersion if found, or {@code null} otherwise
*/
@Deprecated
public DatasetVersion retrieveDatasetVersionByIdentiferClause(String identifierClause, String version) {
return retrieveDatasetVersionByIdentifierClause(identifierClause, version);
}

/**
* @param identifierClause
* @param version
* @return a DatasetVersion if found, or {@code null} otherwise
*/
public DatasetVersion retrieveDatasetVersionByIdentifierClause(String identifierClause, String version) {

if (identifierClause == null){
return null;
@@ -620,7 +634,7 @@ public RetrieveDatasetVersionResponse retrieveDatasetVersionByPersistentId(Strin
identifierClause += " AND ds.identifier = '" + parsedId.getIdentifier() + "'";


DatasetVersion ds = retrieveDatasetVersionByIdentiferClause(identifierClause, version);
DatasetVersion ds = retrieveDatasetVersionByIdentifierClause(identifierClause, version);

if (ds != null){
msg("retrieved dataset: " + ds.getId() + " semantic: " + ds.getSemanticVersion());
@@ -718,7 +732,7 @@ public DatasetVersion getDatasetVersionById(Long datasetId, String version){

String identifierClause = this.getIdClause(datasetId);

DatasetVersion ds = retrieveDatasetVersionByIdentiferClause(identifierClause, version);
DatasetVersion ds = retrieveDatasetVersionByIdentifierClause(identifierClause, version);

return ds;

11 changes: 6 additions & 5 deletions src/main/java/edu/harvard/iq/dataverse/Shib.java
@@ -24,6 +24,7 @@
import java.util.Arrays;
import java.util.Date;
import java.util.List;
import java.util.logging.Level;
import java.util.logging.Logger;
import jakarta.ejb.EJB;
import jakarta.ejb.EJBException;
@@ -62,7 +63,7 @@ public class Shib implements java.io.Serializable {
HttpServletRequest request;

private String userPersistentId;
private String internalUserIdentifer;
private String internalUserIdentifier;
AuthenticatedUserDisplayInfo displayInfo;
/**
* @todo Remove this boolean some day? Now the mockups show a popup. Should
@@ -210,8 +211,8 @@ public void init() {
}

String usernameAssertion = getValueFromAssertion(ShibUtil.usernameAttribute);
internalUserIdentifer = ShibUtil.generateFriendlyLookingUserIdentifer(usernameAssertion, emailAddress);
logger.fine("friendly looking identifer (backend will enforce uniqueness):" + internalUserIdentifer);
internalUserIdentifier = ShibUtil.generateFriendlyLookingUserIdentifier(usernameAssertion, emailAddress);
logger.log(Level.FINE, "friendly looking identifier (backend will enforce uniqueness): {0}", internalUserIdentifier);

String shibAffiliationAttribute = settingsService.getValueForKey(SettingsServiceBean.Key.ShibAffiliationAttribute);
String affiliation = (StringUtils.isNotBlank(shibAffiliationAttribute))
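Besides the rename, the logging line in this hunk switches from string concatenation to a parameterized `logger.log` call. A standalone sketch of why that matters (the logger name and values below are arbitrary, not from Shib.java):

```java
import java.util.logging.Level;
import java.util.logging.Logger;

// Sketch of the parameterized-logging pattern adopted above: with
// logger.log(Level.FINE, "... {0}", arg), the message template is only
// formatted when a handler actually publishes FINE records, so no
// throwaway string is built on the common path where FINE is disabled.
public class ParameterizedLoggingSketch {
    private static final Logger logger =
            Logger.getLogger(ParameterizedLoggingSketch.class.getName());

    public static void main(String[] args) {
        String internalUserIdentifier = "jdoe";
        // Eager concatenation (always builds the string):
        // logger.fine("identifier: " + internalUserIdentifier);
        // Lazy, parameterized form (formats only when published):
        logger.log(Level.FINE, "identifier: {0}", internalUserIdentifier);
    }
}
```

The `{0}` placeholder follows `java.text.MessageFormat` conventions, so the same template can also carry multiple arguments.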
@@ -326,7 +327,7 @@ public String confirmAndCreateAccount() {
AuthenticatedUser au = null;
try {
au = authSvc.createAuthenticatedUser(
new UserRecordIdentifier(shibAuthProvider.getId(), lookupStringPerAuthProvider), internalUserIdentifer, displayInfo, true);
new UserRecordIdentifier(shibAuthProvider.getId(), lookupStringPerAuthProvider), internalUserIdentifier, displayInfo, true);
} catch (EJBException ex) {
/**
* @todo Show the ConstraintViolationException, if any.
@@ -354,7 +355,7 @@ public String confirmAndConvertAccount() {
visibleTermsOfUse = false;
ShibAuthenticationProvider shibAuthProvider = new ShibAuthenticationProvider();
String lookupStringPerAuthProvider = userPersistentId;
UserIdentifier userIdentifier = new UserIdentifier(lookupStringPerAuthProvider, internalUserIdentifer);
UserIdentifier userIdentifier = new UserIdentifier(lookupStringPerAuthProvider, internalUserIdentifier);
logger.fine("builtin username: " + builtinUsername);
AuthenticatedUser builtInUserToConvert = authSvc.canLogInAsBuiltinUser(builtinUsername, builtinPassword);
if (builtInUserToConvert != null) {
@@ -94,7 +94,7 @@ public Feed listCollectionContents(IRI iri, AuthCredentials authCredentials, Swo
throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Could not find dataverse: " + dvAlias);
}
} else {
throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Couldn't determine target type or identifer from URL: " + iri);
throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Couldn't determine target type or identifier from URL: " + iri);
}
}

@@ -219,7 +219,7 @@ public void deleteMediaResource(String uri, AuthCredentials authCredentials, Swo
throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Unsupported file type found in URL: " + uri);
}
} else {
throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Target or identifer not specified in URL: " + uri);
throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Target or identifier not specified in URL: " + uri);
}
}

@@ -580,7 +580,7 @@ public boolean updateProvider( AuthenticatedUser authenticatedUser, String authe
* {@code userDisplayInfo}, a lookup entry for them based
* UserIdentifier.getLookupStringPerAuthProvider (within the supplied
* authentication provider), and internal user identifier (used for role
* assignments, etc.) based on UserIdentifier.getInternalUserIdentifer.
* assignments, etc.) based on UserIdentifier.getInternalUserIdentifier.
*
* @param userRecordId
* @param proposedAuthenticatedUserIdentifier
@@ -605,20 +605,20 @@ public AuthenticatedUser createAuthenticatedUser(UserRecordIdentifier userRecord
proposedAuthenticatedUserIdentifier = proposedAuthenticatedUserIdentifier.trim();
}
// we now select a username for the generated AuthenticatedUser, or give up
String internalUserIdentifer = proposedAuthenticatedUserIdentifier;
String internalUserIdentifier = proposedAuthenticatedUserIdentifier;
// TODO should lock table authenticated users for write here
if ( identifierExists(internalUserIdentifer) ) {
if ( identifierExists(internalUserIdentifier) ) {
if ( ! generateUniqueIdentifier ) {
return null;
}
int i=1;
String identifier = internalUserIdentifer + i;
String identifier = internalUserIdentifier + i;
while ( identifierExists(identifier) ) {
i += 1;
}
authenticatedUser.setUserIdentifier(identifier);
} else {
authenticatedUser.setUserIdentifier(internalUserIdentifer);
authenticatedUser.setUserIdentifier(internalUserIdentifier);
}
authenticatedUser = save( authenticatedUser );
// TODO should unlock table authenticated users for write here
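A self-contained sketch of the suffix search in the hunk above, with `identifierExists` stubbed by a `Set` for illustration. Note that as rendered here the `while` body only increments `i` and never rebuilds `identifier`; whether that is a truncation in this view or a latent bug in the excerpt, the sketch recomputes the candidate on each pass so the loop terminates:

```java
import java.util.Set;

// Illustration of createAuthenticatedUser's username-uniquifying step.
// The taken-names lookup is a stand-in for the database check.
public class UniqueIdentifierSketch {
    static String uniquify(String internalUserIdentifier, Set<String> taken) {
        if (!taken.contains(internalUserIdentifier)) {
            return internalUserIdentifier;
        }
        int i = 1;
        String identifier = internalUserIdentifier + i;
        while (taken.contains(identifier)) {
            i += 1;
            identifier = internalUserIdentifier + i; // rebuild candidate with the new suffix
        }
        return identifier;
    }
}
```

For example, with "jdoe" and "jdoe1" already taken, the first free candidate is "jdoe2".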
@@ -25,18 +25,31 @@ public class UserIdentifier {
/**
* The String used in the permission system to assign roles, for example.
*/
String internalUserIdentifer;
String internalUserIdentifier;

public UserIdentifier(String lookupStringPerAuthProvider, String internalUserIdentifer) {
public UserIdentifier(String lookupStringPerAuthProvider, String internalUserIdentifier) {
this.lookupStringPerAuthProvider = lookupStringPerAuthProvider;
this.internalUserIdentifer = internalUserIdentifer;
this.internalUserIdentifier = internalUserIdentifier;
}

public String getLookupStringPerAuthProvider() {
return lookupStringPerAuthProvider;
}

/**
* @deprecated because of a typo; use {@link #getInternalUserIdentifier()} instead
* @see #getInternalUserIdentifier()
* @return the internal user identifier
*/
@Deprecated
public String getInternalUserIdentifer() {
return internalUserIdentifer;
return getInternalUserIdentifier();
}

/**
* @return the internal user identifier
*/
public String getInternalUserIdentifier() {
return internalUserIdentifier;
}
}
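The UserIdentifier change above is the rename pattern this commit applies throughout: keep the misspelled member as a `@Deprecated` delegate so existing callers still compile, and move the logic under the corrected name. A minimal standalone version of the same pattern (class and field names are illustrative):

```java
// Sketch of the deprecate-and-delegate rename used in this commit: the
// old, misspelled accessor survives as a thin wrapper over the new one,
// so both spellings return the same value during the transition.
public class RenameSketch {
    private final String internalUserIdentifier;

    public RenameSketch(String internalUserIdentifier) {
        this.internalUserIdentifier = internalUserIdentifier;
    }

    /** @deprecated because of a typo; use {@link #getInternalUserIdentifier()} instead */
    @Deprecated
    public String getInternalUserIdentifer() {
        return getInternalUserIdentifier();
    }

    public String getInternalUserIdentifier() {
        return internalUserIdentifier;
    }
}
```

Delegating (rather than duplicating the body) guarantees the two names can never drift apart before the deprecated one is removed.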
@@ -133,7 +133,24 @@ public static String findSingleValue(String mayHaveMultipleValues) {
return singleValue;
}

/**
* @deprecated because of a typo; use {@link #generateFriendlyLookingUserIdentifier(String, String)} instead
* @see #generateFriendlyLookingUserIdentifier(String, String)
* @param usernameAssertion
* @param email
* @return a friendly-looking user identifier based on the asserted username or email, or a UUID as fallback
*/
@Deprecated
public static String generateFriendlyLookingUserIdentifer(String usernameAssertion, String email) {
return generateFriendlyLookingUserIdentifier(usernameAssertion, email);
}

/**
* @param usernameAssertion
* @param email
* @return a friendly-looking user identifier based on the asserted username or email, or a UUID as fallback
*/
public static String generateFriendlyLookingUserIdentifier(String usernameAssertion, String email) {
if (usernameAssertion != null && !usernameAssertion.isEmpty()) {
return usernameAssertion;
}
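The body of `generateFriendlyLookingUserIdentifier` is collapsed past the first branch in this diff. A hedged sketch of the behavior its Javadoc describes — asserted username first, then something derived from the email, then a random UUID — where the email-local-part step is an assumption, not the shipped ShibUtil code:

```java
import java.util.UUID;

// Illustrative only: the username branch matches the visible excerpt;
// the email and UUID fallbacks are inferred from the Javadoc above
// ("based on the asserted username or email, or a UUID as fallback").
public class FriendlyIdSketch {
    public static String generate(String usernameAssertion, String email) {
        if (usernameAssertion != null && !usernameAssertion.isEmpty()) {
            return usernameAssertion;
        }
        if (email != null && email.contains("@")) {
            return email.split("@")[0]; // assumed: local part of the email
        }
        return UUID.randomUUID().toString(); // last-resort unique identifier
    }
}
```

Whatever is returned here is only "friendly looking"; per the log message in Shib.java, the backend still enforces uniqueness.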
@@ -450,7 +450,7 @@ private void loadChecksumManifest() {
// We probably want package files to be able to use specific stores instead.
// More importantly perhaps, the approach above does not take into account
// if the dataset may have an AlternativePersistentIdentifier, that may be
// designated isStorageLocationDesignator() - i.e., if a different identifer
// designated isStorageLocationDesignator() - i.e., if a different identifier
// needs to be used to name the storage directory, instead of the main/current
// persistent identifier above.
getJobLogger().log(Level.INFO, "Reading checksum manifest: " + manifestAbsolutePath);
@@ -109,7 +109,7 @@ public void open(Serializable checkpoint) throws Exception {
// We probably want package files to be able to use specific stores instead.
// More importantly perhaps, the approach above does not take into account
// if the dataset may have an AlternativePersistentIdentifier, that may be
// designated isStorageLocationDesignator() - i.e., if a different identifer
// designated isStorageLocationDesignator() - i.e., if a different identifier
// needs to be used to name the storage directory, instead of the main/current
// persistent identifier above.
getJobLogger().log(Level.INFO, "Reading dataset directory: " + directory.getAbsolutePath()
@@ -83,7 +83,7 @@ public JsonObject execute(CommandContext ctxt) throws CommandException {
// We probably want package files to be able to use specific stores instead.
// More importantly perhaps, the approach above does not take into account
// if the dataset may have an AlternativePersistentIdentifier, that may be
// designated isStorageLocationDesignator() - i.e., if a different identifer
// designated isStorageLocationDesignator() - i.e., if a different identifier
// needs to be used to name the storage directory, instead of the main/current
// persistent identifier above.
if (!isValidDirectory(directory)) {
