News
Using the Real-Life Trial dataset, widely regarded as the most valuable high-stakes dataset, we demonstrate that a multi-layer perceptron achieved the highest accuracy of 88% and a recall of 92.86%. Along with ...
The model, called TTT-MLP (Test-Time Training Multilayer Perceptron), uses TTT layers, which enhance the capabilities of pre-trained transformers by allowing their hidden states themselves to be neural networks.
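As a sketch of the underlying idea (not the TTT-MLP paper's implementation), a test-time-training layer treats its hidden state as the weights of a small inner model, updated by a gradient step on a self-supervised loss at every token. The function name, the linear inner model, and the reconstruction loss below are illustrative assumptions.

```python
import numpy as np

def ttt_linear(tokens, lr=0.1):
    """Minimal test-time-training layer (illustrative sketch).

    The hidden "state" is itself a model: a weight matrix W updated by
    one gradient step per token on the self-supervised reconstruction
    loss ||W x - x||^2, then applied to produce that token's output.
    """
    d = tokens.shape[1]
    W = np.eye(d) * 0.5              # hidden state = weights of an inner linear model
    outputs = []
    for x in tokens:                 # x has shape (d,)
        err = W @ x - x              # inner model's reconstruction error
        W -= lr * np.outer(err, x)   # gradient step: d/dW ||Wx - x||^2 = 2 err x^T (2 folded into lr)
        outputs.append(W @ x)        # output computed with the updated state
    return np.stack(outputs), W
```

Feeding the same token repeatedly drives the reconstruction error toward zero, which is the sense in which such a layer "learns" at test time; replacing the linear inner model with a small MLP gives the TTT-MLP variant the blurb refers to.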
The generative-AI-based ST-Wave MLP introduces the repeatable ST-WaveBlock structure, which is built on multilayer perceptrons (MLPs). Each ST-WaveBlock includes two key modules: the ...
Recent open-vocabulary detectors achieve promising performance with abundant region-level annotated data. In this work, we show that an open-vocabulary detector co-trained with a large language model ...
The model was constructed using data related to brain strokes. The aim of this work is to use a multi-layer perceptron (MLP) as a classification technique for stroke data, trained with multiple optimizers that ...
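A minimal sketch of the approach that blurb describes, comparing an MLP classifier under different optimizers. The stroke dataset is not available here, so the test uses toy data; the function name, architecture, and hyperparameters are illustrative assumptions, not the paper's.

```python
import numpy as np

def train_mlp(X, y, optimizer="sgd", hidden=16, epochs=300, lr=0.05, seed=0):
    """One-hidden-layer MLP binary classifier trained with a chosen
    optimizer ('sgd' or 'adam'); returns training accuracy."""
    rng = np.random.default_rng(seed)
    params = [rng.normal(0, 0.5, (X.shape[1], hidden)), np.zeros(hidden),
              rng.normal(0, 0.5, (hidden, 1)), np.zeros(1)]
    m = [np.zeros_like(p) for p in params]   # Adam first moment
    v = [np.zeros_like(p) for p in params]   # Adam second moment
    for t in range(1, epochs + 1):
        h = np.tanh(X @ params[0] + params[1])
        p = 1.0 / (1.0 + np.exp(-(h @ params[2] + params[3]).ravel()))
        # gradients of mean binary cross-entropy w.r.t. the logit
        dlogit = (p - y)[:, None] / len(y)
        dh = dlogit @ params[2].T * (1 - h ** 2)
        grads = [X.T @ dh, dh.sum(0), h.T @ dlogit, dlogit.sum(0)]
        for i, g in enumerate(grads):
            if optimizer == "adam":
                m[i] = 0.9 * m[i] + 0.1 * g
                v[i] = 0.999 * v[i] + 0.001 * g ** 2
                mh = m[i] / (1 - 0.9 ** t)       # bias-corrected moments
                vh = v[i] / (1 - 0.999 ** t)
                params[i] -= lr * mh / (np.sqrt(vh) + 1e-8)
            else:                                # plain gradient descent
                params[i] -= lr * g
    h = np.tanh(X @ params[0] + params[1])
    p = 1.0 / (1.0 + np.exp(-(h @ params[2] + params[3]).ravel()))
    return ((p > 0.5) == y).mean()
```

Calling `train_mlp(X, y, "sgd")` and `train_mlp(X, y, "adam")` on the same data is the kind of multi-optimizer comparison the blurb suggests.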
If you have written or found overlooked SNN papers, you can add them to this document via pull request. [2025.04.11] Updated SNN-related papers from ICLR 2025 (11 papers) and CVPR 2025 (13 papers). [2025.02.06] ...
Figure 1. Number of publications by publication year for deep-learning methods: DL, deep learning; CNN, convolutional neural network; DBN, deep belief network; LSTM, long short-term memory; AEN, ...