Merge pull request #167 from w3c/mlagally-microscope-use-case

Update index.html

mlagally committed Dec 14, 2021
2 parents 9263d5d + 929b0a2 commit 4293c0b
Showing 1 changed file with 175 additions and 3 deletions.
@@ -878,7 +878,7 @@ <h2>Irrigation in Outdoor Environment</h2>

<ul>
<li>WoT Architecture: wireless communication in outdoor environments presents some issues: communication
consumes lots of energy, sensor nodes have limited energy, weather conditions impact communication
quality</li>
<li>WoT Thing Description: the affordance should be precise enough to describe the soil at a specific
depth or the root zone volume or the min temperature per day</li>
@@ -4755,8 +4755,180 @@ <h2>Health Notifiers</h2>
</dd>
</dl>
</section>
</section>



<section id="Biomedical">
<h2>Biomedical Devices</h2>
<section id="biomedical">
<h2>Digital Microscopes</h2>
<dl>
<dt>Submitter(s)</dt>
<dd>
Adam Sobieski
</dd>
<dt>Category</dt>
<dd>
This use case could be horizontal, insofar as it advances digital microscopy for consumers, and
vertical, insofar as it equips biomedical professionals, scientists, and educators.
</dd>
<dt>Target Users</dt>
<dd>
<ul>
<li>device owners</li>
<li>device users</li>
<li>cloud providers</li>
<li>service providers</li>
<li>device manufacturers</li>
<li>identity providers</li>
</ul>
</dd>
<dt>Motivation</dt>
<dd>
Microscopes are used throughout biomedicine, the sciences, and education. Advancing digital microscopes and
enabling their interoperability with mixed-reality collaborative spaces via WoT architecture and standards can
equip biomedical professionals, scientists, and educators, amplifying and accelerating their performance and
productivity.
</dd>
<dt>Expected Devices</dt>
<dd>
Mixed-reality collaborative spaces are device agnostic. Users can collaborate while making use of AR devices, VR
devices, mobile computers, and desktop computers.
The expected devices include AR and VR equipment (e.g., head-mounted displays), computing devices, and digital
microscopes.
</dd>
<dt>Expected Data</dt>
<dd>
<p>
The expected data include 2D and 3D streams produced by digital microscopes and recordings thereof. These
streams may contain metadata describing the instantaneous magnifications and timescales of the data.
The expected data also include the output streams produced by services. These streams could, for instance,
contain annotation data.
</p>
<p>
With respect to annotating video streams, one could use secondary video tracks with uniquely identified
bounding boxes, or more intricate silhouettes, defining spatial regions to which semantic data, e.g.,
metadata or annotations, can be attached using yet other secondary tracks. Similar approaches could work
for point-cloud-based and mesh-based animations.
</p>
</dd>
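The annotation-track approach above can be sketched concretely. This is a minimal illustration only: the record fields (`timestamp`, `regionId`, `shape`, `semantics`) are hypothetical names for this sketch, not taken from any WoT or media-container standard.

```python
# Hypothetical sketch of one entry in a secondary annotation track.
# All field names are illustrative, not drawn from any existing standard.

def make_annotation(frame_ts, region_id, box, label, lang="en"):
    """Build an annotation record attaching semantic data to a
    uniquely identified spatial region (bounding box) of a frame."""
    x, y, w, h = box
    return {
        "timestamp": frame_ts,   # media time of the annotated frame, in seconds
        "regionId": region_id,   # unique id; later tracks can reference this region
        "shape": {"type": "box", "x": x, "y": y, "w": w, "h": h},
        "semantics": {"label": label, "lang": lang},
    }

# One annotation labeling a bounding box in a microscope video stream.
ann = make_annotation(12.48, "region-7", (310, 205, 64, 48), "mitochondrion")
```

A silhouette-based variant would replace the `shape` value with a polygon; point-cloud and mesh animations could carry analogous records keyed to 3D regions.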
<dt>Dependencies - Affected WoT deliverables and/or work items</dt>
<dd>
To be determined
</dd>
<dt>Description</dt>
<dd>
<p>
Mixed-reality collaborative spaces enable users to visualize and interact with data and to work together from
multiple locations on shared tasks and projects.
</p>
<p>
Digital microscopes could be accessed and used from mixed-reality collaborative spaces via WoT architecture
and standards, and could thus serve biomedicine, the sciences, and education.
Data from digital microscopes could be processed by services to produce outputs useful to users. Users could
select and configure one or more such services and route streaming data or recordings through them to
consume the resultant data in a mixed-reality collaborative space. Graphs, or networks, of such services
could be created by users. Services could also communicate back to digital microscopes to control their
mechanisms and settings. Services which simultaneously process digital microscope data and communicate back
to control such devices could provide users with automatic focusing, magnification, and tracking.
</p>
<p>
Multimodal user interfaces could be dynamically generated for digital microscope content by making use of
the output data provided by computer-vision-related services. Such dynamic multimodal user interfaces could
provide users with the means of pointing and using spoken natural language to indicate precisely which
contents they wish to focus on, magnify, or track.
</p>
<p>
For example, a digital microscope could be magnifying and streaming 2D or 3D imagery of a living animal
cell. This data could be processed by a service which provides computer-vision-related annotations, labeling
parts of the cell: the cell nucleus, Golgi apparatus, ribosomes, the endoplasmic reticulum, mitochondria,
and so forth. Users could then interact with the resultant visual content and its algorithmically generated
annotations, pointing and using spoken natural language to indicate precisely which parts of the living
animal cell they wish the digital microscope to focus on, magnify, or track.
</p>
</dd>
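The closed loop described above — a service that consumes microscope output and commands the device in return — can be sketched as follows. The `MicroscopeThing` class and its `track` action are stand-ins invented for this sketch; a real deployment would bind to the device's actual WoT Thing Description.

```python
# Sketch of a closed-loop service: consume detections derived from
# microscope frames and command the microscope to track the target.
# MicroscopeThing and its affordance names are hypothetical.

class MicroscopeThing:
    """Stand-in for a WoT Consumer bound to a digital microscope."""
    def __init__(self):
        self.center = (0.5, 0.5)   # normalized field-of-view center
        self.magnification = 40

    def invoke_action(self, name, params):
        # Dispatch on hypothetical action affordances.
        if name == "track":
            self.center = (params["x"], params["y"])
        elif name == "magnify":
            self.magnification = params["level"]

def tracking_service(thing, detections):
    """Route each detection (e.g., from a computer-vision service)
    back to the microscope as a 'track' action; the last detection
    determines the final field-of-view center."""
    for x, y in detections:
        thing.invoke_action("track", {"x": x, "y": y})
    return thing.center

scope = MicroscopeThing()
tracking_service(scope, [(0.42, 0.61), (0.44, 0.60)])
```

Automatic focusing and magnification would follow the same pattern, with the service choosing the action and parameters from its analysis of the stream.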
<dt>Security Considerations</dt>
<dd>
The streaming of digital microscope data should be securable for biomedical scenarios.
Access to the controls and settings of digital microscopes should be securable for education scenarios so that
teachers can adjust the controls and students cannot.
</dd>
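The teacher/student distinction above amounts to role-based access over affordances. A minimal sketch, assuming illustrative role and affordance names (none are defined by WoT standards):

```python
# Illustrative role-based gate over microscope affordances.
# Role and affordance names are assumptions for this sketch.

READ_ONLY = {"streamVideo"}                       # students: view only
CONTROL = READ_ONLY | {"focus", "magnify", "track"}  # teachers: full control

PERMISSIONS = {"teacher": CONTROL, "student": READ_ONLY}

def authorize(role, affordance):
    """Return True if the given role may invoke the affordance;
    unknown roles are denied everything."""
    return affordance in PERMISSIONS.get(role, set())
```

In practice such policy would be enforced at the device or gateway using the security schemes declared in the Thing Description, not in application code.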
<dt>Privacy Considerations</dt>
<dd>
In biomedical scenarios, there are privacy issues pertaining to the use of biological samples and medical
data from patients.
</dd>
<dt>Accessibility Considerations</dt>
<dd>
To be determined
</dd>
<dt>Internationalisation (i18n) Considerations</dt>
<dd>
Output data from services could include natural-language content or labels. Such content or labels could be
multilingual.
Dynamically generated multimodal user interfaces utilizing such content or labels could also be multilingual.
</dd>
<dt>Requirements</dt>
<dd>
<p>
Requirements that are not addressed in the current WoT standards or building blocks include streaming
protocols and formats for 3D digital microscope data and recordings. While digital microscopes could stream
video using a variety of existing protocols and formats, the streaming of other forms of 3D data and
animations, e.g., point clouds and meshes, could be facilitated by recommendation.
</p>
<p>
Users could select and configure one or more services and route data streaming from digital microscopes
through them to consume the resultant data in a mixed-reality collaborative space. Additionally, services
could be designed to communicate back to and control the mechanisms and settings of digital microscopes.
Requirements that are not addressed in the current WoT standards or building blocks include a means of
interconnecting services. Perhaps services could utilize WoT architecture and could be described as WoT
things, or virtual devices, providing, among other functionality, the means to establish data connectivity
between them.
</p>
</dd>
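The user-composed graph of services described above can be sketched as a simple pipeline, each service transforming the stream before handing it on. The service names (`denoise`, `annotate`) and frame representation are invented for this sketch.

```python
# Sketch of a user-defined chain of services routing microscope data.
# Each service maps frame -> frame; a graph of services degenerates
# here to an ordered pipeline. All names are illustrative.

def denoise(frame):
    """Hypothetical preprocessing service."""
    return {**frame, "denoised": True}

def annotate(frame):
    """Hypothetical computer-vision service adding semantic labels."""
    labels = frame.get("labels", []) + ["nucleus"]
    return {**frame, "labels": labels}

def run_pipeline(frame, services):
    """Route a frame through an ordered chain of services, as a user
    might wire virtual devices together in a collaborative space."""
    for service in services:
        frame = service(frame)
    return frame

out = run_pipeline({"ts": 0.0}, [denoise, annotate])
```

Describing each service as a WoT Thing would let such graphs be assembled from Thing Descriptions rather than hard-coded function lists.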
<!-- <dt>Gaps</dt>
<dd>
To be determined
</dd>
<dt>Existing standards</dt>
<dd>
To be determined
</dd>
<dt>Comments</dt>
<dd>
</dd>
-->
</dl>
</section>
</section>
</section>
</section>


<section id="energy">
<h2>Energy</h2>
@@ -8727,4 +8899,4 @@ <h3>Domain: other</h3>
</section-->
</body>

</html>
