This repository has been archived by the owner on Nov 16, 2023. It is now read-only.

Make utils pip installable using setup.py
- Added the pip wheel metadata directory to .gitignore
- Created the setup.py file, exposing only the utils_nlp directory
- Updated the setup documentation, noting that Windows installation can break
- Added steps for installing the repo with pip
- Updated the pip install documentation to use SSH (avoiding the username/password prompt) and to point to the right egg file
- Matched the version number to August in __init__.py, per a review comment
- Added MANIFEST.in to the source distribution
- Updated the documentation to warn users about Windows installation issues
Emmanuel Awa authored and Emmanuel Awa committed Jul 24, 2019
1 parent 2f37f0d commit 76360ec
Showing 5 changed files with 118 additions and 2 deletions.
1 change: 1 addition & 0 deletions .gitignore
@@ -34,6 +34,7 @@ MANIFEST
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
pip-wheel-metadata/

# Unit test / coverage reports
htmlcov/
8 changes: 8 additions & 0 deletions MANIFEST.in
@@ -0,0 +1,8 @@
graft utils_nlp

global-exclude *.py[cod] __pycache__ *.so *.dylib

exclude README.md
exclude SETUP.md
exclude CONTRIBUTING.md

16 changes: 14 additions & 2 deletions SETUP.md
@@ -25,7 +25,8 @@ Depending on the type of NLP system and the notebook that needs to be run, there

### Requirements

* A machine running Linux, MacOS or Windows.
> NOTE: Windows machines are not **fully supported**. Use at your own risk.
* Miniconda or Anaconda with Python version >= 3.6.
* This is pre-installed on Azure DSVM such that one can run the following steps directly. To setup on your local machine, [Miniconda](https://docs.conda.io/en/latest/miniconda.html) is a quick way to get started.
* It is recommended to update conda to the latest version: `conda update -n base -c defaults conda`
@@ -69,4 +70,15 @@ We can register our created conda environment to appear as a kernel in the Jupyter
conda activate my_env_name
python -m ipykernel install --user --name my_env_name --display-name "Python (my_env_name)"

If you are using the DSVM, you can [connect to JupyterHub](https://docs.microsoft.com/en-us/azure/machine-learning/data-science-virtual-machine/dsvm-ubuntu-intro#jupyterhub-and-jupyterlab) by browsing to `https://your-vm-ip:8000`.

## Install this repository via PIP
A [setup.py](setup.py) file is provided to simplify installation of the utilities in this repo from the main directory.

pip install -e utils_nlp

It is also possible to install directly from GitHub.

pip install -e git+git@github.com:microsoft/nlp.git@master#egg=utils_nlp

**NOTE** - The pip installation does not install any of the necessary package dependencies; it is expected that conda will be used, as shown above, to set up the environment for the utilities being used.
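After installing, it can be useful to confirm that the `utils_nlp` distribution actually resolved. A minimal check, assuming Python 3.8+ for `importlib.metadata` (the package itself may or may not be installed on your machine):

```python
# Quick post-install sanity check (a sketch; assumes Python 3.8+).
from importlib import metadata


def installed_version(dist_name):
    """Return the installed version of a distribution, or None if absent."""
    try:
        return metadata.version(dist_name)
    except metadata.PackageNotFoundError:
        return None


version = installed_version("utils_nlp")
print(version or "utils_nlp is not installed; run the pip command above first")
```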
15 changes: 15 additions & 0 deletions __init__.py
@@ -0,0 +1,15 @@
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License.

__title__ = "Microsoft NLP"
__version__ = "2019.08"
__author__ = "NLPDev Team at Microsoft"
__license__ = "MIT"
__copyright__ = "Copyright 2018-present Microsoft Corporation"

# Synonyms
TITLE = __title__
VERSION = __version__
AUTHOR = __author__
LICENSE = __license__
COPYRIGHT = __copyright__
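The setup.py below reads `VERSION` by importing this `__init__.py` directly. The same single-source-of-version pattern can be sketched without touching `sys.path`, by executing the file in isolation with `runpy` (the file contents and temp path here are illustrative, not part of the repo):

```python
# Sketch: pull VERSION out of a package __init__.py without installing the
# package, by executing the file standalone and reading its globals.
import runpy
import tempfile
from pathlib import Path

# Minimal stand-in mirroring the metadata fields above.
init_source = '__version__ = "2019.08"\nVERSION = __version__\n'

with tempfile.TemporaryDirectory() as tmp:
    init_path = Path(tmp) / "__init__.py"
    init_path.write_text(init_source)
    namespace = runpy.run_path(str(init_path))  # runs the file, returns its globals
    print(namespace["VERSION"])  # -> 2019.08
```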
80 changes: 80 additions & 0 deletions setup.py
@@ -0,0 +1,80 @@
#!/usr/bin/env python
# -*- encoding: utf-8 -*-
from __future__ import absolute_import
from __future__ import print_function

import io

import re
from glob import glob
from os.path import basename, dirname, join, splitext

from setuptools import find_packages, setup

VERSION = __import__("__init__").VERSION


def read(*names, **kwargs):
with io.open(
join(dirname(__file__), *names),
encoding=kwargs.get("encoding", "utf8"),
) as fh:
return fh.read()


setup(
name="utils_nlp",
version=VERSION,
license="MIT License",
description="NLP Utility functions that are used for best practices in building state-of-the-art NLP methods and scenarios. Developed by Microsoft AI CAT",
long_description="%s\n%s"
% (
re.compile("^.. start-badges.*^.. end-badges", re.M | re.S).sub(
"", read("README.md")
),
re.sub(":[a-z]+:`~?(.*?)`", r"``\1``", read("CONTRIBUTING.md")),
),
author="AI CAT",
author_email="teamsharat@microsoft.com",
url="https://github.com/microsoft/nlp",
packages=["utils_nlp"],
include_package_data=True,
zip_safe=True,
classifiers=[
# complete classifier list: http://pypi.python.org/pypi?%3Aaction=list_classifiers
"Development Status :: 5 - Production/Stable",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Operating System :: Unix",
"Operating System :: POSIX",
"Operating System :: Microsoft :: Windows",
"Programming Language :: Python :: 3.6",
"Programming Language :: Python :: 3.7",
"Programming Language :: Python :: Implementation :: CPython",
"Programming Language :: Python :: Implementation :: PyPy",
"Topic :: Text Processing :: Linguistic",
"Topic :: Utilities",
"Intended Audience :: Science/Research",
"Intended Audience :: Education",
"Intended Audience :: Financial and Insurance Industry",
"Intended Audience :: Healthcare Industry",
"Intended Audience :: Information Technology",
"Intended Audience :: Telecommunications Industry",
],
project_urls={
"Documentation": "https://github.com/microsoft/nlp/",
"Issue Tracker": "https://github.com/microsoft/nlp/issues",
},
keywords=[
"Microsoft NLP",
"Natural Language Processing",
"Text Processing",
"Word Embedding",
],
python_requires=">=3.6",
install_requires=[],
dependency_links=[],
extras_require={},
setup_requires=[],
)
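The `long_description` above strips a badge block out of the README before concatenating it with CONTRIBUTING.md. The regex can be exercised in isolation; the sample README text below is made up for illustration:

```python
# Exercising the badge-stripping regex from long_description in isolation.
# With re.M, ^ matches at each line start; with re.S, .* spans newlines,
# so the pattern eats everything from ".. start-badges" to ".. end-badges".
import re

sample = (
    "Project Title\n"
    ".. start-badges\n"
    "|build| |coverage|\n"
    ".. end-badges\n"
    "Real description starts here.\n"
)

badge_block = re.compile("^.. start-badges.*^.. end-badges", re.M | re.S)
cleaned = badge_block.sub("", sample)
print(cleaned)
```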
