
PatchOptions.meta.k8s.io "" is invalid: fieldManager: Required value: is required for apply patch #1036

Closed
shachardevops opened this issue Nov 30, 2021 · 12 comments
Labels
lifecycle/rotten Denotes an issue or PR that has aged beyond stale and will be auto-closed.

Comments

@shachardevops

I'm trying to work with the Apply function, but I'm facing a couple of problems.

func ApplyConfigmap(key, name, namespace string, data []byte, immutable bool, opts metav1.ApplyOptions) (*v1.ConfigMap, error) {
	clientset := CreateK8sClient(config.KubeConfigPath, true)
	// Build the declarative apply configuration for the ConfigMap.
	applyConfig := corev1.ConfigMap(name, namespace)
	applyConfig.WithBinaryData(map[string][]byte{key: data})
	applyConfig.WithImmutable(immutable)
	// Server-side apply; opts must carry a non-empty FieldManager.
	cm, err := clientset.CoreV1().ConfigMaps(namespace).Apply(context.TODO(), applyConfig, opts)
	if err != nil {
		return nil, err
	}
	return cm, nil
}

This returns the error:

PatchOptions.meta.k8s.io "" is invalid: fieldManager: Required value: is required for apply patch

The function that fails (in client-go's rest package) is:

func (r *Request) Do(ctx context.Context) Result {
	var result Result
	err := r.request(ctx, func(req *http.Request, resp *http.Response) {
		result = r.transformResponse(resp, req)
	})
	if err != nil {
		return Result{err: err}
	}
	return result
}
@shachardevops (Author)

What should I do to fix it?

@weiyi1125

FieldManager is required for apply requests. You can try this: metav1.ApplyOptions{FieldManager: "application/apply-patch"}
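
A minimal, self-contained sketch of the fix (the kubeconfig path, the my-config ConfigMap, and the my-controller field manager name are assumptions; the apiserver accepts any non-empty string as the field manager, though a name that identifies your component is the conventional choice):

package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	applycorev1 "k8s.io/client-go/applyconfigurations/core/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Load the kubeconfig from the default location (~/.kube/config).
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		log.Fatal(err)
	}
	clientset, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}

	// Build the declarative apply configuration for the ConfigMap.
	cmApply := applycorev1.ConfigMap("my-config", "default").
		WithBinaryData(map[string][]byte{"config.bin": []byte("hello")})

	// Server-side apply requires a non-empty FieldManager that identifies
	// the actor taking ownership of the applied fields.
	cm, err := clientset.CoreV1().ConfigMaps("default").Apply(
		context.TODO(),
		cmApply,
		metav1.ApplyOptions{FieldManager: "my-controller"},
	)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("applied ConfigMap %s/%s\n", cm.Namespace, cm.Name)
}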

@k8s-triage-robot

The Kubernetes project currently lacks enough contributors to adequately respond to all issues and PRs.

This bot triages issues and PRs according to the following rules:

  • After 90d of inactivity, lifecycle/stale is applied
  • After 30d of inactivity since lifecycle/stale was applied, lifecycle/rotten is applied
  • After 30d of inactivity since lifecycle/rotten was applied, the issue is closed

You can:

  • Mark this issue or PR as fresh with /remove-lifecycle stale
  • Mark this issue or PR as rotten with /lifecycle rotten
  • Close this issue or PR with /close
  • Offer to help out with Issue Triage

Please send feedback to sig-contributor-experience at kubernetes/community.

/lifecycle stale

@k8s-ci-robot added the lifecycle/stale label (Denotes an issue or PR has remained open with no activity and has become stale.) on Mar 9, 2022
@k8s-triage-robot

The Kubernetes project currently lacks enough active contributors to adequately respond to all issues and PRs.

This bot triages issues and PRs according to the following rules:

  • After 90d of inactivity, lifecycle/stale is applied
  • After 30d of inactivity since lifecycle/stale was applied, lifecycle/rotten is applied
  • After 30d of inactivity since lifecycle/rotten was applied, the issue is closed

You can:

  • Mark this issue or PR as fresh with /remove-lifecycle rotten
  • Close this issue or PR with /close
  • Offer to help out with Issue Triage

Please send feedback to sig-contributor-experience at kubernetes/community.

/lifecycle rotten

@k8s-ci-robot added the lifecycle/rotten label (Denotes an issue or PR that has aged beyond stale and will be auto-closed.) and removed the lifecycle/stale label on Apr 8, 2022
@k8s-triage-robot

The Kubernetes project currently lacks enough active contributors to adequately respond to all issues and PRs.

This bot triages issues and PRs according to the following rules:

  • After 90d of inactivity, lifecycle/stale is applied
  • After 30d of inactivity since lifecycle/stale was applied, lifecycle/rotten is applied
  • After 30d of inactivity since lifecycle/rotten was applied, the issue is closed

You can:

  • Reopen this issue or PR with /reopen
  • Mark this issue or PR as fresh with /remove-lifecycle rotten
  • Offer to help out with Issue Triage

Please send feedback to sig-contributor-experience at kubernetes/community.

/close

@k8s-ci-robot (Contributor)

@k8s-triage-robot: Closing this issue.

In response to this:

The Kubernetes project currently lacks enough active contributors to adequately respond to all issues and PRs.

This bot triages issues and PRs according to the following rules:

  • After 90d of inactivity, lifecycle/stale is applied
  • After 30d of inactivity since lifecycle/stale was applied, lifecycle/rotten is applied
  • After 30d of inactivity since lifecycle/rotten was applied, the issue is closed

You can:

  • Reopen this issue or PR with /reopen
  • Mark this issue or PR as fresh with /remove-lifecycle rotten
  • Offer to help out with Issue Triage

Please send feedback to sig-contributor-experience at kubernetes/community.

/close

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.

@2291896153

What should I do to fix it?

Do you have a solution now?

@jessequinn

FieldManager: "application/apply-patch"

the following is the solution @2291896153 it worked for me

ariefrahmansyah added a commit to caraml-dev/merlin that referenced this issue Jul 10, 2023

**What this PR does / why we need it**:

1. Fix PDB apply by supplying `FieldManager: "application/apply-patch"`
kubernetes/client-go#1036 (comment)
2. Fix PDB config to always have % as suffix

**Which issue(s) this PR fixes**:

Fixes PDB implementation

**Does this PR introduce a user-facing change?**:

```release-note
NONE
```

**Checklist**

- [x] Added unit test, integration, and/or e2e tests
- [x] Tested locally
- [ ] Updated documentation
- [ ] Updated Swagger spec if the PR introduces API changes
- [ ] Regenerated Golang and Python clients if the PR introduces API changes
@qiref

qiref commented Sep 12, 2023

When I use the client like this:

client.AppsV1().DaemonSets(NameSpace).Patch(Name, types.ApplyPatchType, []byte(patchContent))

I get the same error:

 PatchOptions.meta.k8s.io "" is invalid: fieldManager: Required value: is required for apply patch

But there is no metav1.ApplyOptions parameter in the method Patch(name string, pt types.PatchType, data []byte, subresources ...string) (result *v1.DaemonSet, err error).

@weiyi1125

@liggitt (Member)

liggitt commented Sep 12, 2023

What version of the client are you using? In all recent versions, the signature looks like this:

Patch(ctx context.Context, name string, pt types.PatchType, data []byte, opts metav1.PatchOptions, subresources ...string) (result *v1.DaemonSet, err error)
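
With that signature, the field manager for an apply patch is passed through metav1.PatchOptions. A minimal sketch, assuming an existing clientset (the applyPatchDaemonSet helper and the my-controller manager name are illustrative, not part of the thread):

import (
	"context"

	appsv1 "k8s.io/api/apps/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/types"
	"k8s.io/client-go/kubernetes"
)

// applyPatchDaemonSet server-side-applies a patch to a DaemonSet. For
// types.ApplyPatchType the patch body must be a full YAML/JSON object
// including apiVersion and kind.
func applyPatchDaemonSet(clientset kubernetes.Interface, namespace, name string, patch []byte) (*appsv1.DaemonSet, error) {
	return clientset.AppsV1().DaemonSets(namespace).Patch(
		context.TODO(),
		name,
		types.ApplyPatchType, // "application/apply-patch+yaml"
		patch,
		// For apply patches, FieldManager must be non-empty.
		metav1.PatchOptions{FieldManager: "my-controller"},
		// No subresources: patch the DaemonSet itself.
	)
}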

@qiref

qiref commented Sep 12, 2023

v0.0.0-20191016111102-bec269661e48

@liggitt (Member)

liggitt commented Sep 12, 2023

Yeah, updating to a recent version like v0.28.1 is recommended.
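
For example, with Go modules (k8s.io/api and k8s.io/apimachinery are pulled up to matching versions automatically):

go get k8s.io/client-go@v0.28.1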
