Replace remaining broken docs links in NeptuneML sample notebooks #468

Merged
merged 2 commits into from
Mar 22, 2023
2 changes: 1 addition & 1 deletion ChangeLog.md
Original file line number Diff line number Diff line change
@@ -5,7 +5,7 @@ Starting with v1.31.6, this file will contain a record of major features and upd
## Upcoming
- Fixed Dockerfile builds breaking with AL2023 ([Link to PR](https://github.com/aws/graph-notebook/pull/466))
- Fixed `--store-to` option for several magics ([Link to PR](https://github.com/aws/graph-notebook/pull/463))
- Fixed broken documentation links in Neptune-ML-04-Introduction-to-Edge-Classification-Gremlin ([Link to PR](https://github.com/aws/graph-notebook/pull/467))
- Fixed broken documentation links in Neptune ML notebooks ([PR #1](https://github.com/aws/graph-notebook/pull/467)) ([PR #2](https://github.com/aws/graph-notebook/pull/468))

## Release 3.7.3 (March 14, 2023)
- Fixed detailed mode output for graph summary requests ([Link to PR](https://github.com/aws/graph-notebook/pull/461))
@@ -30,7 +30,7 @@
"\n",
"Graphs and graph data are all about using the values and connections within that data to provide novel insight. However, one common issue with graph data is that it is frequently incomplete, meaning that it contains missing property values or connections. While incomplete data is not unique to graphs, the connected nature of how we want to use graph data makes these gaps even more impactful, usually leading to inefficient traversals and/or incorrect results. Neptune ML was released to help mitigate these issues by integrating Machine Learning (ML) models into real-time graph traversals to predict/infer missing graph elements such as properties and connections. \n",
"\n",
"[Neptune ML](https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning.html#machine-learning-overview) is a feature of Amazon Neptune that enables users to automate the creation, management, and usage of Graph Neural Network (GNN) machine learning models within Amazon Neptune. Neptune ML is built using [Amazon SageMaker](https://aws.amazon.com/sagemaker/) and [Deep Graph Library](https://www.dgl.ai/) and provides a simple and easy to use mechanism to build/train/maintain these models and then use the predictive capabilities of these models within a Gremlin query to predict elements or property values in the graph. These models cover many common use cases such as:\n",
"[Neptune ML](https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning.html) is a feature of Amazon Neptune that enables users to automate the creation, management, and usage of Graph Neural Network (GNN) machine learning models within Amazon Neptune. Neptune ML is built using [Amazon SageMaker](https://aws.amazon.com/sagemaker/) and [Deep Graph Library](https://www.dgl.ai/) and provides a simple and easy to use mechanism to build/train/maintain these models and then use the predictive capabilities of these models within a Gremlin query to predict elements or property values in the graph. These models cover many common use cases such as:\n",
"\n",
"* [Identifying fraudulent transactions](https://aws.amazon.com/blogs/machine-learning/detecting-fraud-in-heterogeneous-networks-using-amazon-sagemaker-and-deep-graph-library/)\n",
"* Predicting group membership in a social or identity network\n",
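The Gremlin inference queries these notebooks build up to look roughly like the sketch below. The helper only assembles the query text; the endpoint ID, node label, and property name are hypothetical placeholders for illustration, not values from this PR:

```python
# Illustrative sketch only: builds the text of a Neptune ML node-classification
# Gremlin query. The endpoint ID, node label, and property name are
# hypothetical placeholders, not values taken from this repository.
def build_classification_query(endpoint_id: str, label: str, prop: str) -> str:
    return (
        f'g.with("Neptune#ml.endpoint","{endpoint_id}")'
        f'.V().hasLabel("{label}")'
        f'.properties("{prop}").with("Neptune#ml.classification").value()'
    )

print(build_classification_query("my-endpoint-id", "movie", "genre"))
```

In a notebook this query text would typically be run through the `%%gremlin` magic against the cluster.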
@@ -95,7 +95,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"If the check above did not say that this cluster is ready to run Neptune ML jobs then please check that the cluster meets all the pre-requisites defined [here](https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning.html#machine-learning-overview).\n",
"If the check above did not say that this cluster is ready to run Neptune ML jobs then please check that the cluster meets all the pre-requisites defined [here](https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning.html).\n",
"\n",
"## Loading data \n",
"\n",
@@ -214,7 +214,7 @@
"\n",
"In your use case the models will require configuration and training parameter adjustments to maximize the accuracy of the inferences they generate. These pretrained models use the default parameters to demonstrate the base accuracy of the models prior to tuning. Additional information on how to tune these models is available via the links below:\n",
"\n",
"* [Training File Configuration](https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning-on-graphs-processing-training-config-file.html)\n",
"* [Training File Configuration](https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning-processing-training-config-file.html)\n",
"* [Tuning Hyperparameters](https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning-customizing-hyperparams.html)\n",
"* [Improving Model Performance](https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning-improve-model-performance.html)\n",
"\n",
@@ -713,4 +713,4 @@
},
"nbformat": 4,
"nbformat_minor": 4
}
}
@@ -20,7 +20,7 @@
"\n",
"<div style=\"background-color:#eeeeee; padding:10px; text-align:left; border-radius:10px; margin-top:10px; margin-bottom:10px; \"><b>Note</b>: This notebook takes approximately 1 hour to complete</div>\n",
"\n",
"[Neptune ML](https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning.html#machine-learning-overview) is a feature of Amazon Neptune that enables users to automate the creation, management, and usage of Graph Neural Network (GNN) machine learning models within Amazon Neptune. Neptune ML is built using [Amazon SageMaker](https://aws.amazon.com/sagemaker/) and [Deep Graph Library](https://www.dgl.ai/) and provides a simple and easy to use mechanism to build/train/maintain these models and then use the predictive capabilities of these models within a Gremlin query to predict elements or property values in the graph. \n",
"[Neptune ML](https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning.html) is a feature of Amazon Neptune that enables users to automate the creation, management, and usage of Graph Neural Network (GNN) machine learning models within Amazon Neptune. Neptune ML is built using [Amazon SageMaker](https://aws.amazon.com/sagemaker/) and [Deep Graph Library](https://www.dgl.ai/) and provides a simple and easy to use mechanism to build/train/maintain these models and then use the predictive capabilities of these models within a Gremlin query to predict elements or property values in the graph.\n",
"\n",
"For this notebook we are going to show how to perform a common machine learning task known as **node classification**. Node classification is a common semi-supervised machine learning task where a model built using labeled nodes, ones where the property value exists, can predict the value (or class) of the nodes. Node classification is not unique to GNN based models (look at DeepWalk or node2vec) but the GNN based models in Neptune ML provide additional context to the predictions by combining the connectivity and features of the local neighborhood of a node to create a more predictive model.\n",
"\n",
@@ -87,7 +87,7 @@
{
"cell_type": "markdown",
"source": [
"If the check above did not say that this cluster is ready to run Neptune ML jobs then please check that the cluster meets all the pre-requisites defined [here](https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning.html#machine-learning-overview).\n",
"If the check above did not say that this cluster is ready to run Neptune ML jobs then please check that the cluster meets all the pre-requisites defined [here](https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning.html).\n",
"\n",
"# Load the data\n",
"The first step in building a Neptune ML model is to load data into the Neptune cluster. Loading data for Neptune ML follows the standard process of ingesting data into Amazon Neptune; for this example we'll be using the Bulk Loader. \n",
@@ -262,7 +262,7 @@
"\n",
"# Export the data and model configuration\n",
"\n",
"<div style=\"background-color:#eeeeee; padding:10px; text-align:left; border-radius:10px; margin-top:10px; margin-bottom:10px; \"><b>Note</b>: Before exporting data ensure that Neptune Export has been configured as described here: <a href=\"https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning-data-export-service.html#machine-learning-data-export-service-run-export\">Neptune Export Service</a></div>"
"<div style=\"background-color:#eeeeee; padding:10px; text-align:left; border-radius:10px; margin-top:10px; margin-bottom:10px; \"><b>Note</b>: Before exporting data ensure that Neptune Export has been configured as described here: <a href=\"https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning-manual-setup.html#ml-manual-setup-export-svc\">Neptune Export Service</a></div>"
],
"metadata": {}
},
@@ -271,7 +271,7 @@
"source": [
"With our product knowledge graph loaded we are ready to export the data and configuration which will be used to train the ML model. \n",
"\n",
"The export process is triggered by calling to the [Neptune Export service endpoint](https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning-data-export-service.html). This call contains a configuration object which specifies the type of machine learning model to build, in this example node classification, as well as any feature configurations required. \n",
"The export process is triggered by calling to the [Neptune Export service endpoint](https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning-data-export.html). This call contains a configuration object which specifies the type of machine learning model to build, in this example node classification, as well as any feature configurations required.\n",
"\n",
"<div style=\"background-color:#eeeeee; padding:10px; text-align:left; border-radius:10px; margin-top:10px; margin-bottom:10px; \"><b>Note</b>: The configuration used in this notebook specifies only a minimal set of configuration options meaning that our model's predictions are not as accurate as they could be. The parameters included in this configuration are one of a couple of sets of options available to the end user to tune the model and optimize the accuracy of the resulting predictions.</div>\n",
"\n",
@@ -313,7 +313,7 @@
"\n",
"In our export example below we have specified that the `title` property of our `movie` should be exported and trained as a `text_word2vec` feature, and that our `age` field should range from 0-100 with that data bucketed into 10 distinct groups. \n",
"\n",
"<div style=\"background-color:#eeeeee; padding:10px; text-align:left; border-radius:10px; margin-top:10px; margin-bottom:10px; \"><b>Important</b>: The example below uses only a minimal set of the model configuration parameters and will not create the most accurate model possible. Additional options for tuning this configuration to produce an optimal model are described here: <a href=\"https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning-customizing-hyperparams.html\">Neptune Export Process Parameters</a></div>\n",
"<div style=\"background-color:#eeeeee; padding:10px; text-align:left; border-radius:10px; margin-top:10px; margin-bottom:10px; \"><b>Important</b>: The example below uses only a minimal set of the model configuration parameters and will not create the most accurate model possible. Additional options for tuning this configuration to produce an optimal model are described here: <a href=\"https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning-data-export.html#machine-learning-params\">Neptune Export Process Parameters</a></div>\n",
"\n",
"Running the cell below we set the export configuration and run the export process. Neptune export is capable of automatically creating a clone of the cluster by setting `cloneCluster=True`, which takes about 20 minutes to complete and will incur additional costs while the cloned cluster is running. Exporting from the existing cluster takes about 5 minutes but requires that the `neptune_query_timeout` parameter in the [parameter group](https://docs.aws.amazon.com/neptune/latest/userguide/parameters.html) is set to a large enough value (>72000) to prevent timeout errors."
],
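The export configuration described above can be sketched as a plain Python dictionary before submission to the export service. This is a minimal, illustrative payload assembled from the options the notebook text names (`cloneCluster`, a `text_word2vec` feature on `title`, a bucketed numerical `age` feature); the field names and node labels here are assumptions for illustration, not the authoritative schema:

```python
# Illustrative sketch of a minimal Neptune ML export configuration, based on
# the options named in the notebook text. Field names and node labels are
# assumptions for illustration, not the authoritative export schema.
export_params = {
    "cloneCluster": False,  # True clones the cluster first (~20 min, extra cost)
    "additionalParams": {
        "neptune_ml": {
            "features": [
                {"node": "movie", "property": "title", "type": "text_word2vec"},
                {
                    "node": "user",
                    "property": "age",
                    "type": "bucket_numerical",
                    "range": [0, 100],  # expected value range for `age`
                    "bucket_cnt": 10,   # number of distinct buckets
                },
            ]
        }
    },
}

def bucket_feature_ok(feature: dict) -> bool:
    """Sanity-check a bucketed numerical feature: valid range and bucket count."""
    low, high = feature["range"]
    return low < high and feature["bucket_cnt"] > 0

age_feature = export_params["additionalParams"]["neptune_ml"]["features"][1]
assert bucket_feature_ok(age_feature)
```

A small validation helper like `bucket_feature_ok` catches inverted ranges or zero bucket counts before the export job is submitted, where such mistakes would otherwise surface much later.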
@@ -27,7 +27,7 @@
"\n",
"**Note:** This notebook takes approximately 1 hour to complete\n",
"\n",
"[Neptune ML](https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning.html#machine-learning-overview) is a feature of Amazon Neptune that enables users to automate the creation, management, and usage of Graph Neural Network (GNN) machine learning models within Amazon Neptune. Neptune ML is built using [Amazon SageMaker](https://aws.amazon.com/sagemaker/) and [Deep Graph Library](https://www.dgl.ai/) and provides a simple and easy to use mechanism to build/train/maintain these models and then use the predictive capabilities of these models within a Gremlin query to predict elements or property values in the graph. \n",
"[Neptune ML](https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning.html) is a feature of Amazon Neptune that enables users to automate the creation, management, and usage of Graph Neural Network (GNN) machine learning models within Amazon Neptune. Neptune ML is built using [Amazon SageMaker](https://aws.amazon.com/sagemaker/) and [Deep Graph Library](https://www.dgl.ai/) and provides a simple and easy to use mechanism to build/train/maintain these models and then use the predictive capabilities of these models within a Gremlin query to predict elements or property values in the graph.\n",
"\n",
"For this notebook we are going to show how to perform a common machine learning task known as **node regression**. Node regression is a common semi-supervised machine learning task where a model built using labeled nodes, ones where the property value exists, can predict the numerical value of properties on a node. Node regression is not unique to GNN based models (look at DeepWalk or node2vec) but the GNN based models in Neptune ML provide additional context to the predictions by combining the connectivity and features of the local neighborhood of a node to create a more predictive model.\n",
"\n",
Expand Down Expand Up @@ -87,7 +87,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"If the check above did not say that this cluster is ready to run Neptune ML jobs then please check that the cluster meets all the pre-requisites defined [here](https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning.html#machine-learning-overview).\n",
"If the check above did not say that this cluster is ready to run Neptune ML jobs then please check that the cluster meets all the pre-requisites defined [here](https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning.html).\n",
"\n",
"# Load the data\n",
"The first step in building a Neptune ML model is to load data into the Neptune cluster. Loading data for Neptune ML follows the standard process of ingesting data into Amazon Neptune; for this example we'll be using the Bulk Loader. \n",
@@ -256,7 +256,7 @@
"source": [
"# Export the data and model configuration\n",
"\n",
"<div style=\"background-color:#eeeeee; padding:10px; text-align:left; border-radius:10px; margin-top:10px; margin-bottom:10px; \"><b>Note</b>: Before exporting data ensure that Neptune Export has been configured as described here: <a href=\"https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning-data-export-service.html#machine-learning-data-export-service-run-export\">Neptune Export Service</a></div>"
"<div style=\"background-color:#eeeeee; padding:10px; text-align:left; border-radius:10px; margin-top:10px; margin-bottom:10px; \"><b>Note</b>: Before exporting data ensure that Neptune Export has been configured as described here: <a href=\"https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning-manual-setup.html#ml-manual-setup-export-svc\">Neptune Export Service</a></div>"
]
},
{
@@ -265,7 +265,7 @@
"source": [
"With our product knowledge graph loaded we are ready to export the data and configuration which will be used to train the ML model. \n",
"\n",
"The export process is triggered by calling to the [Neptune Export service endpoint](https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning-data-export-service.html). This call contains a configuration object which specifies the type of machine learning model to build, in this example node classification, as well as any feature configurations required. \n",
"The export process is triggered by calling to the [Neptune Export service endpoint](https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning-data-export.html). This call contains a configuration object which specifies the type of machine learning model to build, in this example node classification, as well as any feature configurations required.\n",
"\n",
"<div style=\"background-color:#eeeeee; padding:10px; text-align:left; border-radius:10px; margin-top:10px; margin-bottom:10px; \"><b>Note</b>: The configuration used in this notebook specifies only a minimal set of configuration options meaning that our model's predictions are not as accurate as they could be. The parameters included in this configuration are one of a couple of sets of options available to the end user to tune the model and optimize the accuracy of the resulting predictions.</div>\n",
"\n",
@@ -309,7 +309,7 @@
"\n",
"In our export example below we have specified that the `title` property of our `movie` should be exported and trained as a `text_word2vec` feature, and that our `age` field should range from 0-100 with that data bucketed into 10 distinct groups. \n",
"\n",
"<div style=\"background-color:#eeeeee; padding:10px; text-align:left; border-radius:10px; margin-top:10px; margin-bottom:10px; \"><b>Important</b>: The example below uses only a minimal set of the model configuration parameters and will not create the most accurate model possible. Additional options for tuning this configuration to produce an optimal model are described here: <a href=\"https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning-data-export-parameters.html\">Neptune Export Process Parameters</a></div>\n",
"<div style=\"background-color:#eeeeee; padding:10px; text-align:left; border-radius:10px; margin-top:10px; margin-bottom:10px; \"><b>Important</b>: The example below uses only a minimal set of the model configuration parameters and will not create the most accurate model possible. Additional options for tuning this configuration to produce an optimal model are described here: <a href=\"https://docs.aws.amazon.com/neptune/latest/userguide/machine-learning-data-export.html#machine-learning-params\">Neptune Export Process Parameters</a></div>\n",
"\n",
"Running the cell below we set the export configuration and run the export process. Neptune export is capable of automatically creating a clone of the cluster by setting `cloneCluster=True`, which takes about 20 minutes to complete and will incur additional costs while the cloned cluster is running. Exporting from the existing cluster takes about 5 minutes but requires that the `neptune_query_timeout` parameter in the [parameter group](https://docs.aws.amazon.com/neptune/latest/userguide/parameters.html) is set to a large enough value (>72000) to prevent timeout errors."
]
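The `neptune_query_timeout` requirement mentioned above is easy to check programmatically before choosing between a direct export and a cluster clone. A minimal sketch, assuming the threshold stated in the notebook text; how the current value is fetched from the parameter group is left out of scope here:

```python
# Sketch: check that neptune_query_timeout is large enough for a direct
# (non-cloned) export. 72000 is the threshold stated in the notebook text;
# retrieving the live value from the DB parameter group is out of scope here.
def query_timeout_sufficient(neptune_query_timeout: int, minimum: int = 72000) -> bool:
    return neptune_query_timeout > minimum

assert query_timeout_sufficient(120000)
assert not query_timeout_sufficient(30000)
```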