Original author(s) | Alex D. Black, Adam Gibson, Vyacheslav Kokorin, Josh Patterson
---|---
Developer(s) | Konduit K. K. and contributors
Preview release | 1.0.0-beta7 / 13 May 2020[1]
Repository |
Written in | Java, CUDA, C, C++
Operating system | Linux, macOS, Windows, Android, iOS
Platform | CUDA, x86, ARM, PowerPC
Available in | English
Type | Natural language processing, deep learning, machine vision, artificial intelligence
License | Apache License 2.0
Website | www
Eclipse Deeplearning4j is a programming library written in Java for the Java virtual machine (JVM).[2][3] It is a framework with wide support for deep learning algorithms.[4] Deeplearning4j includes implementations of the restricted Boltzmann machine, deep belief net, deep autoencoder, stacked denoising autoencoder and recursive neural tensor network, word2vec, doc2vec, and GloVe. These algorithms all include distributed parallel versions that integrate with Apache Hadoop and Spark.[5]
Deeplearning4j is open-source software released under Apache License 2.0,[6] developed mainly by a machine learning group headquartered in San Francisco.[7] It is supported commercially by the startup Skymind, which bundles DL4J, TensorFlow, Keras and other deep learning libraries in an enterprise distribution called the Skymind Intelligence Layer.[8] Deeplearning4j was contributed to the Eclipse Foundation in October 2017.[9][10]
Introduction
Deeplearning4j relies on the widely used programming language Java, though it is compatible with Clojure and includes a Scala application programming interface (API). It is powered by its own open-source numerical computing library, ND4J, and works with both central processing units (CPUs) and graphics processing units (GPUs).[11][12]
Deeplearning4j has been used in several commercial and academic applications. The code is hosted on GitHub.[13] A support forum is maintained on Gitter.[14]
The framework is composable, meaning shallow neural nets such as restricted Boltzmann machines, convolutional nets, autoencoders, and recurrent nets can be added to one another to create deep nets of varying types. It also has extensive visualization tools,[15] and a computation graph.[16]
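As an illustrative sketch of that composability, the following shows how layers are stacked with the MultiLayerNetwork builder API; the layer sizes, updater, and training iterator are placeholder assumptions rather than values from the cited sources.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.learning.config.Adam;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class ComposedNetExample {
    public static void main(String[] args) {
        // Stack a dense hidden layer and a softmax output layer into one deep net.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .seed(123)
                .updater(new Adam(1e-3))
                .list()
                .layer(new DenseLayer.Builder().nIn(784).nOut(256)
                        .activation(Activation.RELU).build())
                .layer(new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                        .nIn(256).nOut(10)
                        .activation(Activation.SOFTMAX).build())
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();
        // net.fit(trainingIterator);  // train on any DataSetIterator, e.g. an MNIST iterator
    }
}
```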
Distributed
Training with Deeplearning4j occurs in a cluster. Neural nets are trained in parallel via iterative reduce, which works on Hadoop-YARN and on Spark.[7][17] Deeplearning4j also integrates with CUDA kernels to conduct pure GPU operations, and works with distributed GPUs.
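A hedged sketch of how such cluster training is typically wired up through the deeplearning4j-spark module, using parameter averaging (one form of the iterative-reduce pattern); the batch size and averaging frequency below are placeholders, and the configuration and RDD are assumed to be built elsewhere.

```java
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.spark.impl.multilayer.SparkDl4jMultiLayer;
import org.deeplearning4j.spark.impl.paramavg.ParameterAveragingTrainingMaster;
import org.nd4j.linalg.dataset.DataSet;

public class SparkTrainingSketch {
    public static void trainDistributed(JavaSparkContext sc,
                                        MultiLayerConfiguration conf,
                                        JavaRDD<DataSet> trainingData) {
        // The builder argument is the number of examples in each DataSet object in the RDD.
        ParameterAveragingTrainingMaster tm =
                new ParameterAveragingTrainingMaster.Builder(32)
                        .averagingFrequency(5)  // average parameters across workers every 5 minibatches
                        .build();

        SparkDl4jMultiLayer sparkNet = new SparkDl4jMultiLayer(sc, conf, tm);
        sparkNet.fit(trainingData);             // training is distributed across the Spark cluster
    }
}
```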
Scientific computing for the JVM
Deeplearning4j includes an n-dimensional array class, backed by ND4J, that allows scientific computing in Java and Scala, similar to the functionality that NumPy provides to Python. ND4J is effectively a library for linear algebra and matrix manipulation in a production environment.
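For illustration, a few ND4J operations that parallel common NumPy calls; the array shapes are chosen arbitrarily.

```java
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class Nd4jExample {
    public static void main(String[] args) {
        INDArray a = Nd4j.rand(3, 4);   // 3x4 matrix of uniform random values
        INDArray b = Nd4j.ones(4, 2);   // 4x2 matrix of ones
        INDArray c = a.mmul(b);         // matrix multiplication -> 3x2
        INDArray d = c.add(1.0);        // elementwise addition of a scalar

        System.out.println(d);
        System.out.println("mean = " + d.meanNumber());
    }
}
```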
DataVec vectorization library for machine-learning
DataVec vectorizes various file formats and data types using an input/output format system similar to Hadoop's use of MapReduce; that is, it turns various data types into columns of scalars termed vectors. DataVec is designed to vectorize CSVs, images, sound, text, video, and time series.[18][19]
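A hedged sketch of a typical DataVec CSV pipeline feeding a Deeplearning4j iterator; the file name, batch size, and label column are placeholder assumptions.

```java
import java.io.File;

import org.datavec.api.records.reader.RecordReader;
import org.datavec.api.records.reader.impl.csv.CSVRecordReader;
import org.datavec.api.split.FileSplit;
import org.deeplearning4j.datasets.datavec.RecordReaderDataSetIterator;
import org.nd4j.linalg.dataset.api.iterator.DataSetIterator;

public class DataVecCsvExample {
    public static void main(String[] args) throws Exception {
        RecordReader reader = new CSVRecordReader(1);            // skip one header line
        reader.initialize(new FileSplit(new File("iris.csv")));  // hypothetical input file

        // Turn records into DataSet minibatches: batch size 50,
        // label in column 4, three output classes.
        DataSetIterator iterator = new RecordReaderDataSetIterator(reader, 50, 4, 3);
        while (iterator.hasNext()) {
            System.out.println(iterator.next().numExamples());
        }
    }
}
```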
Text and NLP
Deeplearning4j includes a vector space modeling and topic modeling toolkit, implemented in Java and integrating with parallel GPUs for performance. It is designed to handle large text sets.
Deeplearning4j includes implementations of term frequency–inverse document frequency (tf–idf), deep learning, and Mikolov's word2vec algorithm,[20] doc2vec, and GloVe, reimplemented and optimized in Java. It relies on t-distributed stochastic neighbor embedding (t-SNE) for word-cloud visualizations.
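A sketch of the Word2Vec builder as commonly shown in Deeplearning4j examples; the corpus file and hyperparameters here are illustrative assumptions, not recommended settings.

```java
import java.io.File;
import java.util.Collection;

import org.deeplearning4j.models.word2vec.Word2Vec;
import org.deeplearning4j.text.sentenceiterator.BasicLineIterator;
import org.deeplearning4j.text.sentenceiterator.SentenceIterator;
import org.deeplearning4j.text.tokenization.tokenizerfactory.DefaultTokenizerFactory;
import org.deeplearning4j.text.tokenization.tokenizerfactory.TokenizerFactory;

public class Word2VecExample {
    public static void main(String[] args) throws Exception {
        SentenceIterator sentences = new BasicLineIterator(new File("corpus.txt")); // one sentence per line
        TokenizerFactory tokenizer = new DefaultTokenizerFactory();

        Word2Vec vec = new Word2Vec.Builder()
                .minWordFrequency(5)   // ignore rare words
                .layerSize(100)        // dimensionality of the word vectors
                .windowSize(5)
                .iterate(sentences)
                .tokenizerFactory(tokenizer)
                .build();

        vec.fit();                     // train the embeddings
        Collection<String> nearest = vec.wordsNearest("day", 10);
        System.out.println(nearest);
    }
}
```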
Real-world use cases and integrations
Real-world use cases for Deeplearning4j include network intrusion detection and cybersecurity, fraud detection for the financial sector,[21][22] anomaly detection in industries such as manufacturing, recommender systems in e-commerce and advertising,[23] and image recognition.[24] Deeplearning4j has integrated with other machine-learning platforms such as RapidMiner, Prediction.io,[25] and Weka.[26]
Machine Learning Model Server
Deeplearning4j serves machine-learning models for inference in production using the free developer edition of SKIL, the Skymind Intelligence Layer.[27][28] A model server serves the parametric machine-learning models that make decisions about data. It is used for the inference stage of a machine-learning workflow, after data pipelines and model training. A model server is the tool that allows data science research to be deployed in a real-world production environment.
What a Web server is to the Internet, a model server is to AI. Where a Web server receives an HTTP request and returns data about a Web site, a model server receives data and returns a decision or prediction about that data: for example, when sent an image, a model server might return a label for that image, identifying faces or animals in the photograph.
The SKIL model server is able to import models from Python frameworks such as TensorFlow, Keras, Theano and CNTK, overcoming a major barrier in deploying deep learning models.
Benchmarks
Deeplearning4j is as fast as Caffe for non-trivial image recognition tasks using multiple GPUs.[29] For programmers unfamiliar with HPC on the JVM, there are several parameters that must be adjusted to optimize neural network training time. These include setting the heap space, choosing the garbage collection algorithm, employing off-heap memory, and pre-saving data (pickling) for faster ETL.[30] Together, these optimizations can lead to a 10x acceleration in performance with Deeplearning4j.
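As an illustrative (not prescriptive) sketch of those knobs: heap size and off-heap limits are set with JVM and JavaCPP flags, while ND4J's garbage-collection cadence can be relaxed from code. All values below are placeholders.

```java
import org.nd4j.linalg.factory.Nd4j;

public class MemoryTuningSketch {
    public static void main(String[] args) {
        // Typical command-line settings (example values only):
        //   -Xms8G -Xmx8G                                -> fixed JVM heap
        //   -Dorg.bytedeco.javacpp.maxbytes=16G          -> off-heap memory available to ND4J
        //   -Dorg.bytedeco.javacpp.maxphysicalbytes=20G  -> cap on total process memory

        // Ask ND4J to invoke the garbage collector at most every 10 seconds,
        // rather than letting frequent GC pauses interrupt training.
        Nd4j.getMemoryManager().setAutoGcWindow(10000);
    }
}
```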
API Languages: Java, Scala, Python, Clojure & Kotlin
Deeplearning4j can be used via multiple API languages including Java, Scala, Python, Clojure and Kotlin. Its Scala API is called ScalNet,[31] Keras serves as its Python API,[32] and its Clojure wrapper is known as DL4CLJ.[33] The core languages performing the large-scale mathematical operations necessary for deep learning are C, C++ and CUDA C.
TensorFlow, Keras & Deeplearning4j
TensorFlow, Keras and Deeplearning4j work together. Deeplearning4j can import models from TensorFlow and other Python frameworks if they have been created with Keras.[34]
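A minimal sketch of importing a Keras-saved model with Deeplearning4j's model-import module; the HDF5 file name and input array are assumptions for illustration.

```java
import org.deeplearning4j.nn.modelimport.keras.KerasModelImport;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class KerasImportExample {
    public static void main(String[] args) throws Exception {
        // Load a Keras Sequential model previously saved to HDF5 (e.g. model.save("model.h5") in Python).
        MultiLayerNetwork model =
                KerasModelImport.importKerasSequentialModelAndWeights("model.h5");

        INDArray input = Nd4j.rand(1, 784);        // placeholder input matching the model's expected shape
        INDArray prediction = model.output(input); // run inference on the imported model
        System.out.println(prediction);
    }
}
```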
References
- ↑ "Releases · eclipse/deeplearning4j". github.com. Retrieved 2021-04-03.
- ↑ Metz, Cade (2014-06-02). "The Mission to Bring Google's AI to the Rest of the World". Wired.com. Retrieved 2014-06-28.
- ↑ Vance, Ashlee (2014-06-03). "Deep Learning for (Some of) the People". Bloomberg Businessweek. Archived from the original on June 4, 2014. Retrieved 2014-06-28.
- ↑ Novet, Jordan (2015-11-14). "Want an open-source deep learning framework? Take your pick". VentureBeat. Retrieved 2015-11-24.
- ↑ "Adam Gibson, DeepLearning4j on Spark and Data Science on JVM with nd4j, SF Spark @Galvanize 20150212". SF Spark Meetup. 2015-02-12. Retrieved 2015-03-01.
- ↑ "Github Repository". GitHub. April 2020.
- ↑ "deeplearning4j.org".
- ↑ "Skymind Intelligence Layer Community Edition". Archived from the original on 2017-11-07. Retrieved 2017-11-02.
- ↑ "Eclipse Deeplearning4j Project Page". 22 June 2017.
- ↑ "Skymind's Deeplearning4j, the Eclipse Foundation, and scientific computing in the JVM". Jaxenter. 13 November 2017. Retrieved 2017-11-15.
- ↑ Harris, Derrick (2014-06-02). "A startup called Skymind launches, pushing open source deep learning". GigaOM.com. Archived from the original on 2014-06-28. Retrieved 2014-06-29.
- ↑ Novet, Jordan (2014-06-02). "Skymind launches with open-source, plug-and-play deep learning features for your app". Retrieved 2014-06-29.
- ↑ "deeplearning4j/deeplearning4j". 29 April 2023. Retrieved 29 April 2023 – via GitHub.
- ↑ "Element". app.gitter.im. Retrieved 29 April 2023.
- ↑ "Deeplearning4j Visualization Tools". Archived from the original on 2017-08-10. Retrieved 2016-08-17.
- ↑ "Deeplearning4j Computation Graph". Archived from the original on 2017-08-10. Retrieved 2016-08-17.
- ↑ "Iterative reduce". GitHub. 15 March 2020.
- ↑ "DataVec ETL for Machine Learning". Archived from the original on 2017-10-02. Retrieved 2016-09-18.
- ↑ "Anomaly Detection for Time Series Data with Deep Learning". InfoQ. Retrieved 29 April 2023.
- ↑ "Google Code Archive - Long-term storage for Google Code Project Hosting". code.google.com. Retrieved 29 April 2023.
- ↑ "Archived copy". Archived from the original on 2016-03-10. Retrieved 2016-02-22.
{{cite web}}
: CS1 maint: archived copy as title (link) - ↑ "skymind.ai". skymind.ai. Retrieved 29 April 2023.
- ↑ "Archived copy". Archived from the original on 2016-03-10. Retrieved 2016-02-22.
{{cite web}}
: CS1 maint: archived copy as title (link) - ↑ "skymind.ai". skymind.ai. Retrieved 29 April 2023.
- ↑ "DeepLearning4J(Stable) | RapidMiner China". www.rapidminerchina.com. Archived from the original on 18 May 2016. Retrieved 22 May 2022.
- ↑ "Home - WekaDeeplearning4j". deeplearning.cms.waikato.ac.nz. Retrieved 29 April 2023.
- ↑ "Products". Archived from the original on 2017-09-21. Retrieved 2017-09-20.
- ↑ "Model Server for Deep Learning and AI - Deeplearning4j: Open-source, Distributed Deep Learning for the JVM". Archived from the original on 2017-09-21. Retrieved 2017-09-20.
- ↑ "GitHub - deeplearning4j/Dl4j-benchmark: Repo to track dl4j benchmark code". GitHub. 19 December 2019.
- ↑ "Deeplearning4j Benchmarks - Deeplearning4j: Open-source, Distributed Deep Learning for the JVM". Archived from the original on 2017-08-09. Retrieved 2017-01-30.
- ↑ "Scala, Spark and Deeplearning4j - Deeplearning4j: Open-source, Distributed Deep Learning for the JVM". Archived from the original on 2017-02-25. Retrieved 2017-02-25.
- ↑ "Running Keras with Deeplearning4j - Deeplearning4j: Open-source, Distributed Deep Learning for the JVM". Archived from the original on 2017-02-25. Retrieved 2017-02-25.
- ↑ "Deep Learning with Clojure - Deeplearning4j: Open-source, Distributed Deep Learning for the JVM". Archived from the original on 2017-02-25. Retrieved 2017-02-25.
- ↑ "Tensorflow & Deeplearning4j - Deeplearning4j: Open-source, Distributed Deep Learning for the JVM". Archived from the original on 2017-09-08. Retrieved 2017-09-07.