
[Discussion] Future plans for dapr api #2817

Open
nobodyiam opened this issue Feb 18, 2021 · 29 comments

Comments

@nobodyiam

Firstly, I'd like to congratulate the dapr community on the 1.0.0 release; this is a remarkable milestone!

I understand the community has been busy with the 1.0.0 release recently, but right now might be a good time to grab a cup of coffee and think about the future.🙂

What I'm interested in, and would like to discuss here, is our future plans for the dapr api: do we intend to make the dapr api a standard that other sidecars/proxies could implement?

I've been playing with the dapr demos recently, and one thing that I'm really fond of is its potential to truly realize Write once, Run anywhere, since dapr defines a very good abstraction layer between application code and the backend services. If every application could be paired with a dapr process/sidecar, this could easily happen.


However, considering real-world cases, where we have different cloud providers, different legacy components and many other factors I could not possibly list, it would be very hard to make dapr available in every situation.

So I'm thinking: why not take a step further and make the abstraction layer a standard? Then other cloud providers could choose to offer hosted dapr solutions, or their own solutions with the same api, and legacy components could also be bridged to the same api. As end users, we could then safely program our applications against the standard api and ship them to different environments with no code change.
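To make the idea concrete, here is a tiny hypothetical sketch of what "programming against the standard api" could look like; the interface and type names are made up for illustration and are not part of any existing Dapr API:

```go
package main

import (
	"context"
	"fmt"
)

// StateAPI is a made-up stand-in for a standardized state API: application code
// depends only on this interface, and which compliant sidecar sits behind it
// becomes a deployment choice rather than a code change.
type StateAPI interface {
	Save(ctx context.Context, store, key string, value []byte) error
}

// checkout is pure business logic; it never references a concrete sidecar.
func checkout(ctx context.Context, state StateAPI, orderID string) error {
	return state.Save(ctx, "orders", orderID, []byte(`{"status":"paid"}`))
}

// inMemoryState stands in for whatever implementation a given environment provides.
type inMemoryState struct{ data map[string][]byte }

func (s *inMemoryState) Save(_ context.Context, store, key string, value []byte) error {
	s.data[store+"/"+key] = value
	return nil
}

func main() {
	state := &inMemoryState{data: map[string][]byte{}}
	if err := checkout(context.Background(), state, "order-1"); err != nil {
		panic(err)
	}
	fmt.Println("saved:", string(state.data["orders/order-1"]))
}
```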


I guess someone might raise their hand and say this sounds good but it looks like we don't need to do anything, as the apis are already defined in the proto files. But what I'm thinking is: if we do intend to make the dapr api a standard, then it would be better to send more obvious signals (e.g. move the proto files to a separate repo) or state our intention more clearly, so that other sidecars/proxies could implement the dapr apis without concerns such as breaking changes.

Looking forward to your ideas/comments on this topic!

BTW, I noticed there was a proposal (#816) to move the dapr & daprclient proto files to a separate repo, but it was closed with no comment...

@yaron2
Member

yaron2 commented Feb 18, 2021

Hey @nobodyiam thanks for bringing this up. Speaking for myself, I think it makes sense to make the (some?) Dapr APIs a standard, as it'll enable interoperability with other technologies.

I guess someone might raise their hand and say this sounds good but it looks like we don't need to do anything, as the apis are already defined in the proto files

The proto files are not enough, and Dapr also has an HTTP 1.1 API that needs to be taken into consideration here. I agree that if we were to try and formulate a specification out of the Dapr APIs, we'd need more than proto/swagger descriptions.
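For reference, here is a minimal client-side sketch of the HTTP/1.1 surface being discussed, assuming the default Dapr HTTP port (3500) and a state store component named `statestore` (both assumptions of this example, not requirements of the API):

```go
package main

import (
	"bytes"
	"fmt"
	"io"
	"net/http"
)

func main() {
	// v1.0 state API of the Dapr sidecar over plain HTTP/1.1.
	base := "http://localhost:3500/v1.0/state/statestore"

	// Save a key/value pair through the state API.
	body := bytes.NewBufferString(`[{"key": "order-1", "value": {"status": "paid"}}]`)
	resp, err := http.Post(base, "application/json", body)
	if err != nil {
		panic(err)
	}
	resp.Body.Close()

	// Read the value back.
	resp, err = http.Get(base + "/order-1")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	data, _ := io.ReadAll(resp.Body)
	fmt.Println(string(data))
}
```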

@msfussell
Member

@nobodyiam - I agree with you that this is a direction to go in, and it will be interesting to see if there is a desire to separate out the APIs. It is certainly a direction to keep in mind as Dapr is further adopted. I would like to hear other people's thoughts here.

@artursouza
Member

There is an opportunity for Dapr API to mature and become a spec. I believe we will need at least another version of the Dapr API to feel more confident about extracting a spec out of it.

@yaron2
Member

yaron2 commented Feb 18, 2021

On the flip side, iterating a spec might lead to some changes in the API.

@nobodyiam
Author

Thanks for your feedback!

Considering the current adoption rate of http/2, I agree we still need to support the http/1.1 api.
However, I think the api specification for http/2 might be a little different from the one for http/1.1, because http/2 has some useful features like streaming and push, which can improve the experience in situations like subscribing to messages or configuration updates.

I believe we will need at least another version of the Dapr API to feel more confident about extracting a spec out of it.

On the flip side, iterating a spec might lead to some changes in the API.

Right, I also believe it may take some time, or even introduce some breaking changes, before we finalize the spec. However, I'm wondering whether there is anything we could do now to help us get closer to that final goal. Please feel free to share your thoughts or ideas; I'll try my best to help.

@artursouza
Member

I believe we need to have a schema for the Dapr API first. Right now it is in proto files and hard-coded in the HTTP handlers.

@xiazuojie

I believe the proto files already make a good candidate for an API spec. The problem is supporting HTTP/1.1 as well.

I think we should give up HTTP/1.1 support so that the API can embrace modern features like streaming and multiplexing.

@withinboredom
Contributor

withinboredom commented Feb 23, 2021

I think we should give up HTTP/1.1 support

I'm assuming when we say HTTP/1.1, we mean the REST API, and when we say HTTP/2, we mean the GRPC API? It may be worth clarifying that GRPC is tied to HTTP/2 while the REST API can operate with either protocol, but IIRC, Dapr doesn't support the REST API over HTTP/2.

IMHO, I'm not sure giving up the REST API is a good idea. REST is especially useful when dealing with legacy systems and/or languages that aren't (yet) supported by GRPC.

so that the API can embrace modern features like streaming and multiplexing.

FWIW, HTTP/1.1 allows streaming (via chunked encoding and range headers), and the benefits of multiplexing can be emulated using connection pools. If the REST API were supported on HTTP/2, there's no reason the REST API couldn't also return a stream (using chunked encoding on 1.1 and native streams on 2).

Edit: it's also worth pointing out that streams aren't very useful for a stateless/shared-nothing language and make the most sense in a language with a shared global state.
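As a small Go sketch of the chunked-encoding streaming mentioned above (the handler path and the messages are purely illustrative):

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	// With no Content-Length set, Go's HTTP/1.1 server uses chunked transfer
	// encoding, so each Flush pushes a chunk to the client immediately.
	http.HandleFunc("/subscribe", func(w http.ResponseWriter, r *http.Request) {
		flusher, ok := w.(http.Flusher)
		if !ok {
			http.Error(w, "streaming unsupported", http.StatusInternalServerError)
			return
		}
		for i := 0; i < 5; i++ {
			fmt.Fprintf(w, "event %d\n", i)
			flusher.Flush() // send this chunk now instead of buffering the whole response
			time.Sleep(time.Second)
		}
	})
	http.ListenAndServe(":8080", nil)
}
```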

@skyao
Member

skyao commented Feb 24, 2021

I agree with giving up http1.1/REST support and focusing on http2/gRPC.

The most important reason is that we have SDKs for almost all of the languages our end users use to connect to dapr. If a language SDK is absent, the best way forward is to add it, not to let end users access the dapr port directly.

And if we go through an SDK anyway, why would we choose REST over grpc?

@artursouza
Member

artursouza commented Feb 24, 2021 via email

@yaron2
Member

yaron2 commented Feb 24, 2021

I don't think we can give up on HTTP1.1/REST because:

  1. It's still widely used in the industry and will continue to be for years to come.
  2. gRPC support doesn't exist for all languages, and some of the existing implementations are still experimental.

As such, dropping HTTP1.1/REST support would undermine the language-agnostic, developer-inclusive goals of the project.

But...

As @artursouza alluded to, an implementation/SDK does not have to implement both gRPC/HTTP1.1 APIs to be a valid implementation.

I'll even go as far as to say that an implementation of the API does not need to implement the entire API surface.

For example:

The GCP PubSub/Alicloud Message Queues managed offerings might decide to support Dapr's Pub/Sub API, while NoSQL managed offerings might support the Dapr State APIs.

@nobodyiam
Author

As @artursouza alluded to, an implementation/SDK does not have to implement both gRPC/HTTP1.1 APIs to be a valid implementation.

I'll even go as far as to say that an implementation of the API does not need to implement the entire API surface.

+1, but I'm not sure whether the spec can be independent of the protocol.
If we want to truly realize Write once, Run anywhere, then the protocol, codec, service and data definitions might all need to be part of the api spec; otherwise the application code still needs to understand the differences between the sidecars/proxies it talks to.

@artursouza
Member

artursouza commented Feb 25, 2021

My point is that for each protocol there would be a "child spec" (similar to how CloudEvents does it). But I agree that each implementation should be complete and not partial.

So one sidecar might implement Dapr over HTTP 1.1 but not HTTP 2, for example. I think this is something implementations should advertise, so that applications know which ones would work for them.

Additionally, the SDKs can make this transparent by "auto-detecting" which protocol is available. The Java SDK, for example, has implementations of both gRPC and HTTP 1.1 behind the scenes.
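A hypothetical sketch of such auto-detection, probing the default Dapr gRPC port (50001) and falling back to HTTP (3500); this only illustrates the idea and is not how any existing SDK is implemented:

```go
package main

import (
	"fmt"
	"net"
	"time"
)

// detectProtocol probes the sidecar's gRPC port first and falls back to HTTP/1.1.
// The ports are the Dapr defaults; the detection strategy itself is made up for
// this sketch.
func detectProtocol() string {
	conn, err := net.DialTimeout("tcp", "localhost:50001", 500*time.Millisecond)
	if err == nil {
		conn.Close()
		return "grpc"
	}
	return "http"
}

func main() {
	fmt.Println("talking to the sidecar over:", detectProtocol())
}
```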

@jrjenks-davinci

If you're going to wrap the publish and subscribe building block, maybe you should look at AsyncAPI instead of OpenAPI.

@wcs1only
Contributor

wcs1only commented Mar 9, 2021

One possible implementation of an HTTP API spec:
OpenAPI. I hacked up a quick spec for our statestore API https://raw.githubusercontent.com/wcs1only/dapr/openapi-spec/swagger/openapi.yaml

@wcs1only
Contributor

wcs1only commented Mar 9, 2021

I highlighted this issue in the Dapr community call today:
https://youtu.be/0SX_1tpeHdI?t=1000

Summary:
We already have a spec for our gRPC API via the .proto files in the Dapr repo. The HTTP API has no published spec (other than our documentation). One possible way to deliver a spec would be to create an OpenAPI spec for our existing v1 HTTP/1.1 API.

I demoed an OpenAPI spec for our state store API: https://raw.githubusercontent.com/wcs1only/dapr/openapi-spec/swagger/openapi.yaml

Some use cases/questions that came up on the community call:
Q: What problems does OpenAPI for Dapr solve?
A:

  • Allows us to build clients for languages we don't currently have (e.g. Lua)
  • Allows us to consistently support other protocols (e.g. HTTP/2 without grpc, CoAP, etc.)
  • Would allow us to autogenerate documentation from the spec and keep it in sync with the implementation

Q: What about AsyncAPI instead of OpenAPI?
A: We can look into that. Here's what we're looking for in a tool:

  • Wide support for languages and frameworks
  • We'll need to be able to fully represent our existing v1 API with no changes
  • Fully open platform

Nice to haves:

  • Integrated documentation
  • Mature tooling

Assuming AsyncAPI does these things and does them as well as or better than OpenAPI, we'll certainly consider it. (In subsequent conversations with @jrjenks-davinci, I think they had something a little different in mind when they asked this question, namely building Daperized applications from application-specific AsyncAPI specs.)

@PlusMinus0

AsyncAPI is heavily inspired by OpenAPI and, as of March 31, is also part of the Linux Foundation.

According to AsyncAPI, they are compatible with the OpenAPI spec, with a few additions.

Although there currently aren't as many frameworks or tools available for AsyncAPI as there are for OpenAPI, in my opinion it makes sense to invest in AsyncAPI simply because OpenAPI does not support pub/sub patterns and AsyncAPI does. Then again, because AsyncAPI is compatible with OpenAPI, it might make sense to start with OpenAPI support, keeping AsyncAPI support in mind for future releases.

Currently, our workflow is to extract a json-schema from the spec and validate messages against it. For us, it would be of immense help if dapr could read the spec and validate messages in the sidecar.
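A sketch of that validation step in Go, using the third-party gojsonschema package as one possible library; the schema and message are made up for illustration:

```go
package main

import (
	"fmt"

	"github.com/xeipuuv/gojsonschema"
)

func main() {
	// A schema (hypothetically extracted from an API spec) and an incoming
	// message to validate against it; both are made up for this sketch.
	schema := gojsonschema.NewStringLoader(`{
		"type": "object",
		"required": ["orderId"],
		"properties": {"orderId": {"type": "string"}}
	}`)
	message := gojsonschema.NewStringLoader(`{"orderId": "1234"}`)

	result, err := gojsonschema.Validate(schema, message)
	if err != nil {
		panic(err)
	}
	if result.Valid() {
		fmt.Println("message conforms to the spec")
		return
	}
	for _, desc := range result.Errors() {
		fmt.Println("validation error:", desc)
	}
}
```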

@dapr-bot
Collaborator

dapr-bot commented Jul 3, 2021

This issue has been automatically marked as stale because it has not had activity in the last 30 days. It will be closed in the next 7 days unless it is tagged (pinned, good first issue, help wanted or triaged/resolved) or other activity occurs. Thank you for your contributions.

@geffzhang

That's a good question, I'll do it, don't close it

@kevinten10

Hi team, I want to know...

What is the current progress of the standard API definition?

We explored similar ideas in mosn/layotto#188:

Hi, let me share my thoughts:

Although several projects (dapr/layotto/capa) all claim to use a unified API definition, in practice each plays on its own; for example, dapr-proto and layotto-proto are independent implementations (even though they are written the same way).

If the goal is that "a user can write once and run everywhere" using an SDK, then we may need to add an abstraction layer on top of them, just like the abstraction layer we build over middleware, to hide the differences between the dapr/layotto/capa projects.

For example, following The Reactive Manifesto, we could jointly launch an "xxxx Manifesto" with dapr and then define a unified API in that declaration through protobuf. These APIs would have nothing to do with specific projects such as dapr/layotto/capa.
From this protobuf, a unified API in different languages could be generated. For example, reactive-streams-jvm is the Java implementation of the interface defined by the reactive declaration.

After that, users only need to program against this library when coding, and can then adapt it to the client implemented by dapr, layotto or capa through different adapters.
For example, when I wrote serverless code I coded against spring-cloud-function, so that I had a unified implementation across cloud platforms, and then introduced the appropriate spring-cloud-function-adapter to target aws, gcp or other serverless platforms.

ps: capa (cloud-application-api) is the api-for-SDK project mentioned above.

@seeflood
Member

seeflood commented Nov 29, 2021

Hi guys, I read an interesting interview with @yaron2 about making the Dapr API a new standard :)
I'm very curious how it is going, especially:

  1. According to the discussion above, our next step is to write a protocol-neutral spec for the current Dapr API rather than extracting the Dapr API into another repo (for example, an "open xxx" spec). Am I right?
  2. We will not modify the existing api in the process of writing the spec. Am I right?
  3. What is the current progress, and do we have a timeline?
  4. How can interested developers get involved? Shall we discuss it in the dapr community call?

@yaron2
Member

yaron2 commented Nov 30, 2021

Hi guys, I read an interesting interview with @yaron2 about making the Dapr API a new standard :) I'm very curious how it is going, especially:

  1. According to the discussion above, our next step is to write a protocol-neutral spec for the current Dapr API rather than extracting the Dapr API into another repo (for example, an "open xxx" spec). Am I right?
  2. We will not modify the existing api in the process of writing the spec. Am I right?
  3. What is the current progress, and do we have a timeline?
  4. How can interested developers get involved? Shall we discuss it in the dapr community call?

Hey @seeflood, this is still early days, but at least what I'm envisioning is starting with the current Dapr API as the basis of the spec, which will either be extracted into a separate repository in the dapr org or be moved to a completely new org. I kind of favor the former.

Since most of Dapr's APIs are 1.0, I imagine existing APIs will not change in the process, and any changes will be the basis of future versions of the existing APIs.
There's no timeline for this yet, and as a steering committee member I actually think the API spec could be the first case for a Dapr SIG (special interest group), which will be led by interested parties/developers.

@seeflood
Member

seeflood commented Dec 1, 2021

@yaron2 Thanks for your reply! I am very interested in this Dapr SIG and willing to join it. Setting standards for the future is very cool.

Another detailed question:
Currently the word 'dapr' appears throughout the grpc API, for example in dapr.proto, the package name dapr.proto.runtime.v1, and the service name Dapr. Shall we consider a new name, as CloudEvents and OpenTracing did, and remove the word dapr so that it sounds like a product-independent industry standard?
For example, change the proto package name to open.proto.runtime.v1 in Dapr API v2.0.

I really want to know your opinions on this, because I'm considering having the layotto project (a sidecar used in alipay) support the dapr api (including the same package name and service name defined in dapr.proto), or waiting to see whether dapr is considering changing it, to avoid rework.

@kevinten10

Hi, as a community developer, I think the standard is cool.

I mainly care about two characteristics:

  1. The standard api and dapr should not be strongly bound; we hope to use an industry-standard open api.
  2. It should support custom extension apis. For example, I could fork the standard api and plug pieces in and out. This may require a microkernel design, rather than putting everything into one large, comprehensive file.

If the community plans to define such an api, I am very interested in participating. At the same time, I will migrate the api of cloud-runtimes-jvm to the standard api.

@yaron2
Member

yaron2 commented Dec 13, 2021

Hi, as a community developer, I think the standard is cool.

I mainly care about two characteristics:

  1. The standard api and dapr should not be strongly bound; we hope to use an industry-standard open api.
  2. It should support custom extension apis. For example, I could fork the standard api and plug pieces in and out. This may require a microkernel design, rather than putting everything into one large, comprehensive file.

If the community plans to define such an api, I am very interested in participating. At the same time, I will migrate the api of cloud-runtimes-jvm to the standard api.

I agree that the API and Dapr should not be strongly bound in that the APIs can develop independently from the Go implementation of Dapr.

Supporting custom APIs is also something I think would benefit the community a lot, and it needs some thought inside dapr/dapr as to how it works with middleware, tracing, metrics and security. I don't see an issue there, and at the very least it would be extremely beneficial for developers to be able to plug in their own APIs and leverage all the underlying benefits that Dapr provides out of the box.

@yaron2
Member

yaron2 commented Dec 13, 2021

@seeflood and @kevinten10 I will bring this up with the Dapr steering committee at the next meeting and request to open a SIG for this. I'm including you both unless you tell me otherwise.

@seeflood
Member

@yaron2 Thanks! I'm in.

@kevinten10

@yaron2 Thanks! All right!

@yaron2
Member

yaron2 commented Jan 25, 2022

@seeflood @kevinten10 the API spec SIG is now formally established with you both as co-chairs. Congrats :)

We've created a dedicated repository (which you're both now maintainers of, look for a GitHub invite in your email) and a Discord Channel.

All work, designs and discussions should now occur in that repository.
