LBANN: Livermore Big Artificial Neural Network Toolkit
The Livermore Big Artificial Neural Network toolkit (LBANN) is an open-source, HPC-centric deep learning training framework optimized to compose multiple levels of parallelism.
LBANN provides model-parallel acceleration through domain decomposition, optimizing for strong scaling of network training. It also allows model parallelism to be composed with both data parallelism and ensemble training methods when training large neural networks on massive amounts of data. LBANN takes advantage of tightly coupled accelerators, low-latency, high-bandwidth networking, and high-bandwidth parallel file systems.
LBANN supports state-of-the-art training algorithms, including unsupervised, self-supervised, and adversarial (GAN) methods, in addition to traditional supervised learning. It also supports recurrent neural networks via backpropagation through time (BPTT), transfer learning, and multi-model and ensemble training methods.
For detailed API information, see the Doxygen API documentation.
- Operators
- Operator
- Abs
- Acosh
- Acos
- Add
- AddConstant
- Asin
- Asinh
- Atan
- Atanh
- BinaryCrossEntropy
- BooleanAccuracy
- BooleanFalseNegative
- BooleanFalsePositive
- Ceil
- Clamp
- ConstantSubtract
- Cos
- Cosh
- Divide
- Equal
- EqualConstant
- Erf
- ErfInv
- Exp
- Expm1
- Floor
- Gelu (GELU tanh approximation)
- Greater
- GreaterConstant
- GreaterEqual
- GreaterEqualConstant
- Less
- LessConstant
- LessEqual
- LessEqualConstant
- Log
- Log1p
- LogSigmoid
- LogSoftmax
- LogicalAnd
- LogicalNot
- LogicalOr
- LogicalXor
- Max
- MaxConstant
- Min
- MinConstant
- Mod
- Multiply
- Negative
- NotEqual
- NotEqualConstant
- Pow
- Reciprocal
- Round
- Rsqrt
- SafeDivide
- SafeReciprocal
- Scale
- Select
- Selu
- Sigmoid
- SigmoidBinaryCrossEntropy
- Sign
- Sin
- Sinh
- Softplus
- Softsign
- Sqrt
- Square
- SquareDifference
- Subtract
- SubtractConstant
- Tan
- Tanh
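Most of the entries above are elementwise operators whose names follow common mathematical conventions. As an informal illustration of a few of them, here is a plain-Python sketch of the per-element math; these are reference definitions based on the conventional meanings of the names, not LBANN's actual implementation, and the zero-denominator behavior assumed for `SafeDivide` in particular is an assumption.

```python
import math

def clamp(x, lo, hi):
    # Clamp: restrict x to the interval [lo, hi]
    return min(max(x, lo), hi)

def safe_divide(x, y):
    # SafeDivide: like Divide, but assumed to return 0 when the
    # denominator is 0 instead of producing inf/nan
    return x / y if y != 0 else 0.0

def gelu_tanh(x):
    # Gelu: the GELU tanh approximation noted in the list above:
    # 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi)
                                      * (x + 0.044715 * x ** 3)))

def square_difference(x, y):
    # SquareDifference: (x - y)^2
    return (x - y) ** 2
```

In LBANN these operators apply elementwise across input tensors; the scalar functions above only capture the per-element arithmetic.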