New Command: Sync a content type from the content type hub to a site #5551
Comments
Could we also allow syncing the content type by name?
Also, since contenttype is the last noun, we can shorten contentTypeId to id.
For the ID we need to force it to be treated as a string, or bash will assume a hex number because of the 0x prefix.
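A quick illustration of the concern, using a shortened content type ID for the sketch:

```shell
# In an arithmetic context, bash reads a 0x prefix as hexadecimal,
# so a content type ID like 0x0101 silently becomes a decimal number
echo $((0x0101))   # prints 257 instead of the original ID
```

Forcing string handling keeps the ID intact as typed.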
Ah, good idea! We'd need to get it from the hub first, but that should be possible.
Any other comments before we open this up? @pnp/cli-for-microsoft-365-maintainers
I've got nothing else to add. Good catch on this one!
As mentioned before, we should in fact (for now) always parse all options as strings. Imagine, for example, the option
Can I work on this?
Usage
m365 spo contenttype sync [options]
Description
Adds a published content type from the content type hub to a site or syncs its latest changes
Options
-u, --webUrl <webUrl>
-i, --id [id]. Specify either id or name.
-n, --name [name]. Specify either id or name.
--listTitle [listTitle]. Specify listTitle, listId or listUrl. Omit to sync as a site content type.
--listId [listId]. Specify listTitle, listId or listUrl. Omit to sync as a site content type.
--listUrl [listUrl]. Specify listTitle, listId or listUrl. Omit to sync as a site content type.
Examples
Syncs a given published content type from the hub to the specified site.
Syncs a given published content type from the hub to the specified site and adds it to the specified list.
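The example descriptions above could look like the following commands; the site URL, content type ID, content type name, and list title are hypothetical placeholders:

```shell
# Sync a published content type from the hub to a site
m365 spo contenttype sync --webUrl https://contoso.sharepoint.com/sites/project-x --id 0x0101

# Sync a published content type from the hub and add it to a list
m365 spo contenttype sync --webUrl https://contoso.sharepoint.com/sites/project-x --name 'Project Document' --listTitle Documents
```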
Response
The API response should be returned.
Additional info
The id property should be added to the types.strings array, to force minimist to parse it as a string!
The following endpoint should be used:
https://learn.microsoft.com/en-us/graph/api/contenttype-addcopyfromcontenttypehub?view=graph-rest-1.0&tabs=http
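A sketch of calling that endpoint directly, assuming a hypothetical site ID, content type ID, and a valid access token:

```shell
# Request a copy/sync of a hub content type into the site's content types
curl -X POST \
  'https://graph.microsoft.com/v1.0/sites/{site-id}/contentTypes/addCopyFromContentTypeHub' \
  -H 'Authorization: Bearer <access-token>' \
  -H 'Content-Type: application/json' \
  -d '{ "contentTypeId": "0x0101" }'
```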