6 changes: 6 additions & 0 deletions .github/dependabot.yml
@@ -0,0 +1,6 @@
version: 2
updates:
- package-ecosystem: Cargo
directory: "/"
schedule:
interval: "daily"
68 changes: 68 additions & 0 deletions .github/workflows/binaries.yml
@@ -0,0 +1,68 @@
name: Build Binaries

on:
workflow_call:
inputs:
binary_name:
description: "Expected name of binary"
required: false
type: string
default: "dls"
output_binary_name:
description: "Name of uploaded binary artifact"
type: string
default: "dls"
log-dir:
description: "Folder to put cargo logs in"
required: false
type: string
os:
description: "Machine to run on"
required: true
type: string
env:
CARGO_TERM_COLOR: always
jobs:
build:
runs-on: ${{ inputs.os }}
steps:
- uses: actions/checkout@v4
- name: versions
run: |
rustup --version
cargo --version
rustc --version
- name: Prepare log dir
run: mkdir -p ${{ inputs.log-dir }}
- name: Build
shell: bash
run: cargo build --release --verbose 2>&1 | tee -a ${{ inputs.log-dir }}/build.log
- name: Test
shell: bash
run: |
set -o pipefail
cargo test --verbose 2>&1 | tee -a ${{ inputs.log-dir }}/test.log
- name: Clippy
shell: bash
if: ${{ success() || failure() }}
run: |
set -o pipefail
rustup component add clippy
cargo clippy --version
cargo clippy --all-targets -- --deny warnings 2>&1 | tee -a ${{ inputs.log-dir }}/clippy.log
- name: Upload logs
if: ${{ success() || failure() }}
uses: actions/upload-artifact@v4
with:
name: cargo-logs
overwrite: true
path: ${{ inputs.log-dir }}
if-no-files-found: error
- name: Upload binary
if: ${{ success() || failure() }}
uses: actions/upload-artifact@v4
with:
name: ${{ inputs.output_binary_name }}
overwrite: true
path: target/release/${{ inputs.binary_name }}
if-no-files-found: error
42 changes: 42 additions & 0 deletions .github/workflows/rust.yml
@@ -0,0 +1,42 @@
name: Archive Binary

on:
push:
branches: [ "main" ]
pull_request:
branches: [ "main" ]
schedule:
- cron: "0 0 * * *"
jobs:
build-package:
strategy:
matrix:
os:
- ubuntu-latest
- windows-latest
include:
- os: ubuntu-latest
binary: dls
- os: windows-latest
binary: dls.exe
uses: ./.github/workflows/binaries.yml
with:
binary_name: ${{ matrix.binary }}
output_binary_name: dml-server-${{ matrix.os }}
log-dir: ${{ matrix.binary }}-logs
os: ${{ matrix.os }}
check-package:
uses: ./.github/workflows/scans.yml
with:
os: ubuntu-latest
log-dir: checking-logs
merge-package:
runs-on:
- ubuntu-latest
needs: build-package
steps:
- name: Merge Artifacts
uses: actions/upload-artifact/merge@v4
with:
name: dml-server
pattern: dml-server-*
51 changes: 51 additions & 0 deletions .github/workflows/scans.yml
@@ -0,0 +1,51 @@
name: Cargo Check

on:
workflow_call:
inputs:
os:
description: "Machine to run on"
required: true
type: string
log-dir:
description: "Folder to put cargo logs in"
required: false
type: string

env:
CARGO_TERM_COLOR: always
jobs:
audit:
runs-on: ${{ inputs.os }}
steps:
- uses: actions/checkout@v4
- name: versions
run: |
rustup --version
cargo --version
rustc --version
- name: Prepare log dir
run: mkdir -p ${{ inputs.log-dir }}
- name: Audit Deny
shell: bash
if: ${{ success() || failure() }}
run: |
cargo install cargo-deny
cargo deny --version
set -o pipefail
cargo deny check 2>&1 | tee -a ${{ inputs.log-dir }}/deny-log
- name: Audit Outdated
shell: bash
if: ${{ success() || failure() }}
run: |
cargo install cargo-outdated
cargo outdated --version
set -o pipefail
cargo outdated --exit-code 1 2>&1 | tee -a ${{ inputs.log-dir }}/outdated-log
- name: Upload logs
if: ${{ success() || failure() }}
uses: actions/upload-artifact@v4
with:
name: cargo-scan-logs
overwrite: true
path: ${{ inputs.log-dir }}
8 changes: 8 additions & 0 deletions CHANGELOG.md
@@ -0,0 +1,8 @@
<!--
© 2024 Intel Corporation
SPDX-License-Identifier: Apache-2.0 and MIT
-->
# Change Log

## 1.0.0
- Initial open-source release
151 changes: 150 additions & 1 deletion CONTRIBUTING.md
@@ -2,7 +2,7 @@

### License

<PROJECT NAME> is licensed under the terms in [LICENSE]<link to license file in repo>. By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.
DML Language Server is dual-licensed under the terms in [APACHE 2.0](./LICENSE-APACHE) and [MIT](./LICENSE-MIT). By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.

### Sign your work

@@ -55,3 +55,152 @@ Use your real name (sorry, no pseudonyms or anonymous contributions.)

If you set your `user.name` and `user.email` git configs, you can sign your
commit automatically with `git commit -s`.

## How-To

This section provides information for developers who want to contribute to the
DLS or run it in a heavily customised configuration.

Testing, reporting issues, writing documentation and tests, contributing code,
and implementing clients are all extremely valuable.

To get help with any of this, either open an issue on the repository or contact
the primary developer directly.

If you want to implement DLS support in an editor, see [clients.md](clients.md).

## Building

```
git clone https://github.com/intel/dml-language-server.git
cd dml-language-server
cargo build --release
```

## Running and testing

There are three main ways to run the DLS. You can run it directly with:

```
cargo run --bin dls
```
This starts the DLS in server mode, which is of narrow applicability unless you
plan to send LSP communication into it directly. By passing the `--cli` option
you start the server in command-line mode; see [CLI](#cli).

You can run the Direct-File-Analysis (DFA) binary with:
```
cargo run --bin dfa [options ...] <dls-binary> [files ...]
```
This analyzes the specified `files` using the `dls-binary` server binary as if
they had been opened in a language client. This is useful for quickly testing
and debugging initial file analysis and error reporting without advanced
client interaction.

Most commonly, you'll use an IDE plugin to invoke the DLS binary for you
(see [README.md](README.md) for details).

Test the crate using `cargo test`.

Testing is unfortunately minimal. There is support for some regression tests,
but not many actual tests exist yet. There is significant work to do
before we have a comprehensive testing story.
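
If you add code, a plain unit test anywhere in the crate is picked up by
`cargo test`. A minimal, purely illustrative example (not taken from the DLS
sources):

```
// Purely illustrative: a unit test module like this can live in any source
// file of the crate and runs as part of `cargo test`.
#[cfg(test)]
mod tests {
    #[test]
    fn addition_smoke_test() {
        assert_eq!(2 + 2, 4);
    }
}
```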

### <a id="cli"></a>CLI

You can run the DLS in command-line mode, which is useful for debugging and
testing, especially to narrow down a bug to either the DLS or a client.

You need to run it with the
`--cli` flag, e.g., `cargo run --bin dls -- --cli`. This should initialize the
DLS and then give you a `>` prompt.
Look for the final message signalling the end of the
initialization phase, which will look something like:
```
{"jsonrpc":"2.0","method":"window/showMessage","params":{"message": "DML Language Server Started", "type": 3}}
```

Type `help` (or just `h`) to see the [commands available][CLI_COMMANDS]. Note
that the [positions][LSP_POSITION] in the requests and responses are
_zero-based_ (unlike the line numbers you normally see in an IDE).

[LSP_POSITION]: https://github.com/Microsoft/language-server-protocol/blob/gh-pages/specification.md#position

[CLI_COMMANDS]: dls/src/cmd.rs#L508-L542

## Implementation overview

The idea behind the DLS is to provide fast and mostly-accurate feedback for DML
development, of either specific devices or DML common code. Due to complexities
in the DML language, and the high level of customization available when running
DMLC, this information will not be perfectly accurate or complete at all times.
However, the goal is to give a best-effort analysis that minimizes noisy
incorrect feedback and maximizes the amount of valuable information a developer
can extract, as well as providing basic standard-issue IDE functionality.

### Analysis

The DLS tracks changes to files and keeps the changed files in memory (i.e., the
DLS does not need the IDE to save a file before providing data). These changed
files are tracked by the 'Virtual File System'.

Analysis is divided into two phases:
- The 'isolated' analysis analyses each DML file individually, doing parsing and
some post-parse processing. This phase performs syntactic analysis and is enough
to report basic syntax errors.
- The 'device' analysis analyses from the perspective of a particular DML file
with a device declaration, pulling in information from the various isolated
analyses. This phase performs semantic analysis, and is what
powers reference matching, object resolution, etc.

Isolated analysis is started as soon as the user opens or changes a file
(although this can be configured in the settings to happen only on save).
In addition, the DLS will use information about import paths and workspaces to
try to resolve file imports, and recursively analyse any file imported from
an opened one. See the section on [include paths](#include-paths) for more
information. Device analysis is started as soon as it is determined that
it can be, or needs to be, re-done. Commonly this is when an isolated analysis
involved in the import tree of a device file is updated and all files that the
device depends on have had an isolated analysis done. There are corner-case
exceptions to this (allowing a partially correct device analysis when an
imported file is missing), but for simplicity they are not described here.
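
As a rough mental model only, the triggering logic can be sketched like this;
the type and function names below are invented for illustration and do not
match the actual DLS sources:

```
// Hypothetical sketch of the two analysis phases; not the real DLS types.
use std::collections::HashMap;
use std::path::{Path, PathBuf};

/// Result of analysing one file in isolation (parsing + post-parse checks).
struct IsolatedAnalysis {
    imports: Vec<PathBuf>,
    syntax_errors: Vec<String>,
}

/// Result of analysing a device file together with the files it imports.
struct DeviceAnalysis {
    device: PathBuf,
    semantic_errors: Vec<String>,
}

/// Re-run device analysis once every direct import of the device file has an
/// isolated analysis available (the real DLS walks the whole import tree).
fn maybe_analyse_device(
    device: &Path,
    isolated: &HashMap<PathBuf, IsolatedAnalysis>,
) -> Option<DeviceAnalysis> {
    let root = isolated.get(device)?;
    let ready = root.imports.iter().all(|dep| isolated.contains_key(dep));
    ready.then(|| DeviceAnalysis {
        device: device.to_path_buf(),
        semantic_errors: Vec::new(), // semantic checks would populate this
    })
}

fn main() {
    let mut isolated = HashMap::new();
    isolated.insert(
        PathBuf::from("my-device.dml"),
        IsolatedAnalysis { imports: vec![PathBuf::from("utility.dml")], syntax_errors: vec![] },
    );
    // Not ready yet: "utility.dml" has no isolated analysis of its own.
    assert!(maybe_analyse_device(Path::new("my-device.dml"), &isolated).is_none());
    isolated.insert(
        PathBuf::from("utility.dml"),
        IsolatedAnalysis { imports: vec![], syntax_errors: vec![] },
    );
    assert!(maybe_analyse_device(Path::new("my-device.dml"), &isolated).is_some());
}
```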

### Communicating with IDEs

The DLS communicates with IDEs via
the [Language Server protocol](https://github.com/Microsoft/language-server-protocol/blob/master/protocol.md).

The LS protocol uses JSON sent over stdin/stdout. The JSON is rather dynamic, so
we can't make structs that map easily onto many of the protocol objects. The
client sends commands and notifications to the DLS. Commands must get a reply;
notifications do not. Usually the structure of the reply is dictated by the
protocol spec. The DLS can also send notifications to the client. So for a
long-running task (such as a build), the DLS will reply quickly to acknowledge
the request, then send a message later with the result of the task.

Associating requests with replies is done using an id, which must be handled by
the DLS.
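
For illustration only, here is a minimal sketch of that id pairing, assuming
`serde_json` is available; the method and file path are generic LSP examples,
not necessarily ones the DLS implements:

```
use serde_json::json;

fn main() {
    // A client command: it carries an "id" and expects exactly one reply.
    let request = json!({
        "jsonrpc": "2.0",
        "id": 7,
        "method": "textDocument/definition",
        "params": {
            "textDocument": { "uri": "file:///work/device.dml" },
            "position": { "line": 10, "character": 4 }
        }
    });

    // The reply echoes the same id; notifications carry no id and get no reply.
    let reply = json!({
        "jsonrpc": "2.0",
        "id": request["id"].clone(),
        "result": []
    });
    assert_eq!(request["id"], reply["id"]);
    println!("{reply}");
}
```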

### Extensions to the Language Server Protocol

The DLS uses some custom extensions to the Language Server Protocol.
These are all sent from the DLS to an LSP client and are only used to
improve the user experience by showing progress indicators.

* `window/progress`: notification, `title: "Analysing", value: WorkDoneProgressBegin`. Sent when the first analysis starts
* ... standard LSP `publishDiagnostics`
* `window/progress`: notification, `title: "Analysing", value: WorkDoneProgressEnd`. Sent when the last analysis finishes

### <a id="include-paths"></a>Include Paths

In order to support fairly generic and complicated import configurations, the
DLS uses a context-aware methodology to resolve import paths.

Sidenote: due to past DMLC 1.2/1.4 interoperability functionality, the DLS
will search both "./" and "./1.4/" under every path it searches.

When importing a file "A.dml" from some file "B", the DLS will search, in no
particular order (see the sketch after this list):
- The folder of B
- The root folder of the workspace
- All other workspace roots
- Every include path specified by the [DML Compile commands](README.md#dml-compile-commands) under the related module
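
The following sketch only illustrates how those candidate locations combine
with the "1.4/" interoperability rule; the names are invented and the real
resolution logic lives in the DLS sources:

```
use std::path::{Path, PathBuf};

/// Illustrative only: collect the directories in which an imported file would
/// be looked for, given the importing file, the workspace roots, and the
/// include paths from the DML compile commands.
fn candidate_dirs(
    importer: &Path,
    workspace_roots: &[PathBuf],
    compile_command_includes: &[PathBuf],
) -> Vec<PathBuf> {
    let mut bases: Vec<PathBuf> = Vec::new();
    if let Some(dir) = importer.parent() {
        bases.push(dir.to_path_buf()); // the folder of the importing file
    }
    bases.extend(workspace_roots.iter().cloned()); // workspace roots
    bases.extend(compile_command_includes.iter().cloned()); // compile-command include paths

    // Each base is searched both directly and under its "1.4" subfolder,
    // for DMLC 1.2/1.4 interoperability.
    bases
        .into_iter()
        .flat_map(|base| [base.join("1.4"), base])
        .collect()
}

fn main() {
    let dirs = candidate_dirs(
        Path::new("/work/modules/foo/B.dml"),
        &[PathBuf::from("/work")],
        &[PathBuf::from("/work/include")],
    );
    for dir in dirs {
        println!("{}", dir.display());
    }
}
```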
6 changes: 6 additions & 0 deletions COPYRIGHT
@@ -0,0 +1,6 @@
Source code is a modified version of the code available at
https://github.com/rust-lang/rls, which is licensed under Apache 2.0
and MIT.

This code is similarly dual-licensed under Apache 2.0 and MIT,
see the corresponding license files for more information.