@@ -8,15 +8,14 @@ Structure can be explicit as represented by a graph [1,2,5] or implicit as
 induced by adversarial perturbation [3,4].
 
 Structured signals are commonly used to represent relations or similarity
-among samples that may be labeled or unlabeled. Therefore, leveraging these
-signals during neural network training harnesses both labeled and unlabeled
-data, which can improve model accuracy, particularly when **the amount of labeled
-data is relatively small**. Additionally, models trained with samples that are
-generated by adding adversarial perturbation have been shown to be
-**robust against malicious attacks**, which are designed to mislead a model's
-prediction or classification.
-
-NSL generalizes to Neural Graph Learning [1] as well as Adversarial
+among samples that may be labeled or unlabeled. Leveraging these signals during
+neural network training harnesses both labeled and unlabeled data, which can
+improve model accuracy, particularly when **the amount of labeled data is
+relatively small**. Additionally, models trained with samples that are generated
+by adversarial perturbation have been shown to be **robust against malicious
+attacks**, which are designed to mislead a model's prediction or classification.
+
+NSL generalizes to Neural Graph Learning [1] as well as to Adversarial
 Learning [3]. The NSL framework in TensorFlow provides the following easy-to-use
 APIs and tools for developers to train models with structured signals:
 
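The structured-signal idea in the paragraphs above can be sketched in a few lines: alongside the supervised loss, penalize the distance between embeddings of samples that the graph declares similar, so unlabeled neighbors also shape the model. This is a minimal NumPy sketch of that objective, not the NSL API; all names here (`graph_regularized_loss`, `edges`, `alpha`) are illustrative.

```python
import numpy as np

def graph_regularized_loss(embeddings, labels, logits, edges, alpha=0.5):
    """Supervised loss plus alpha * squared embedding distance over graph edges.

    embeddings: (n, d) per-sample embeddings; labels/logits: (n,) for a
    simple regression-style supervised term; edges: list of (i, j) neighbor
    pairs taken from the structured signal (the graph).
    """
    # Supervised part: squared error on the labeled predictions
    supervised = np.mean((logits - labels) ** 2)
    # Neighbor part: samples connected in the graph should embed nearby
    neighbor = sum(np.sum((embeddings[i] - embeddings[j]) ** 2)
                   for i, j in edges)
    return supervised + alpha * neighbor

emb = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0]])
labels = np.array([0.0, 0.0, 1.0])
logits = np.array([0.1, 0.0, 0.9])
edges = [(0, 1)]  # the graph says samples 0 and 1 are similar
total = graph_regularized_loss(emb, labels, logits, edges, alpha=0.5)
```

Only the neighbor term touches unlabeled samples, which is how a small labeled set can still benefit from a large pool of unlabeled data.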
@@ -29,12 +28,11 @@ APIs and tools for developers to train models with structured signals:
 The NSL framework is designed to be flexible and can be used to train any kind
 of neural network. For example, feed-forward, convolution, and recurrent neural
 networks can all be trained using the NSL framework. In addition to supervised
-and semi-supervised learning (low amount of supervision), NSL can also be
-generalized to unsupervised learning. Furthermore, incorporating structure
-is done only during training; there is no change to the serving/inference
-workflow. As a result, no additional cost (latency, memory consumption, etc)
-because of neural structured learning is incurred during serving. Please visit
-our tutorials for a practical introduction to NSL.
+and semi-supervised learning (a low amount of supervision), NSL can in theory be
+generalized to unsupervised learning. Incorporating structured signals is done
+only during training, so the performance of the serving/inference workflow
+remains unchanged. Please check out our tutorials for a practical introduction
+to NSL.
 
 ## Getting started
 
@@ -80,7 +78,7 @@ for tracking requests and bugs. For questions, please direct them to [Stack Over
 ["nsl"](https://stackoverflow.com/questions/tagged/nsl)
 tag.
 
-## Reference
+## References
 
 [[1] T. Bui, S. Ravi and V. Ramavajjala. "Neural Graph Learning: Training Neural Networks Using Graphs." WSDM 2018](https://ai.google/research/pubs/pub46568.pdf)
 