diff --git a/docs/streaming-custom-receivers.md b/docs/streaming-custom-receivers.md
index 732c83dc841d9..a4e17fd24eac2 100644
--- a/docs/streaming-custom-receivers.md
+++ b/docs/streaming-custom-receivers.md
@@ -256,64 +256,3 @@ The following table summarizes the characteristics of both types of receivers
-
-
- You need to extend [`ActorReceiver`](api/scala/index.html#org.apache.spark.streaming.akka.ActorReceiver)
- and store received data into Spark using the `store(...)` methods. The supervisor strategy of
- this actor can be configured to handle failures, etc.
-
- class CustomActor extends ActorReceiver {
- def receive = {
- case data: String => store(data)
- }
- }
-
- // A new input stream can be created with this custom actor as
- val ssc: StreamingContext = ...
- val lines = AkkaUtils.createStream[String](ssc, Props[CustomActor](), "CustomReceiver")
-
- See [ActorWordCount.scala](https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/streaming/ActorWordCount.scala) for an end-to-end example.
-
-
-
- You need to extend [`JavaActorReceiver`](api/scala/index.html#org.apache.spark.streaming.akka.JavaActorReceiver)
- and store received data into Spark using the `store(...)` methods. The supervisor strategy of
- this actor can be configured to handle failures, etc.
-
- class CustomActor extends JavaActorReceiver {
- @Override
- public void onReceive(Object msg) throws Exception {
- store((String) msg);
- }
- }
-
- // A new input stream can be created with this custom actor as
- JavaStreamingContext jssc = ...;
- JavaDStream<String> lines = AkkaUtils.<String>createStream(jssc, Props.create(CustomActor.class), "CustomReceiver");
-
- See [JavaActorWordCount.java](https://github.com/apache/spark/blob/master/examples/src/main/java/org/apache/spark/examples/streaming/JavaActorWordCount.java) for an end-to-end example.
-
-
-
-3. **Deploying:** As with any Spark application, `spark-submit` is used to launch your application.
-You need to package `spark-streaming-akka_{{site.SCALA_BINARY_VERSION}}` and its dependencies into
-the application JAR. Make sure `spark-core_{{site.SCALA_BINARY_VERSION}}` and `spark-streaming_{{site.SCALA_BINARY_VERSION}}`
-are marked as `provided` dependencies as those are already present in a Spark installation. Then
-use `spark-submit` to launch your application (see [Deploying section](streaming-programming-guide.html#deploying-applications) in the main programming guide).
-
-