AI Sharing Circle

What Is Gradient Descent? Explained in One Article

Gradient Descent is the core optimization algorithm for minimizing a function. It determines the direction of descent by computing the function's gradient (the vector of partial derivatives with respect to each parameter) and iteratively updates the parameters according to the rule θ = θ − η∇J(θ).
1 month ago
0 · 8.6K
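The update rule in the summary above can be sketched in a few lines of Python. The quadratic objective J(θ) = (θ − 3)² and all names here are illustrative assumptions, not from the article:

```python
# Minimal gradient descent sketch for J(theta) = (theta - 3)^2,
# whose gradient is dJ/dtheta = 2 * (theta - 3); the minimum is at theta = 3.
def gradient_descent(grad, theta, lr=0.1, steps=100):
    for _ in range(steps):
        theta = theta - lr * grad(theta)  # update rule: theta = theta - eta * grad_J(theta)
    return theta

theta = gradient_descent(lambda t: 2 * (t - 3), theta=0.0)  # converges toward 3.0
```

With a small enough learning rate η, each step shrinks the distance to the minimum by a constant factor, which is why the iterate converges geometrically on this convex objective.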
What Is Logistic Regression? Explained in One Article

Logistic Regression is a statistical learning method for binary classification problems. Its central goal is to predict the probability that a sample belongs to a particular class based on its input features. The model forms a linear combination of the features and maps it to a value between 0 and 1 with an S-shaped (sigmoid) function...
1 month ago
0 · 10.5K
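The "linear combination plus S-shaped function" mapping described above can be sketched as follows; the weights, bias, and feature values are made-up illustrations, not from the article:

```python
import math

def sigmoid(z):
    # Squashes any real-valued linear score into the open interval (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def predict_proba(weights, bias, features):
    # Linear combination of the features, then sigmoid -> probability of the positive class.
    z = sum(w * x for w, x in zip(weights, features)) + bias
    return sigmoid(z)

p = predict_proba([2.0, -1.0], 0.5, [1.0, 3.0])  # z = 2.0 - 3.0 + 0.5 = -0.5
```

A negative score z maps below 0.5 and a positive score above it, which is what makes the 0.5 threshold a natural decision boundary.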
What Is Regularization? Explained in One Article

Regularization is a core technique in machine learning and statistics for preventing model overfitting. It controls the degree of fitting by adding a penalty term, tied to the model's complexity, to the objective function. Common forms include L1 and L2 regularization: L1 produces sparse solutions and applies...
1 month ago
0 · 10.2K
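The "objective plus complexity penalty" structure described above can be sketched like this; the mean-squared-error loss and the λ value are illustrative choices, not taken from the article:

```python
def mse(preds, targets):
    # Base objective: mean squared error between predictions and targets.
    return sum((p - t) ** 2 for p, t in zip(preds, targets)) / len(preds)

def l1_penalty(weights, lam):
    # L1: sum of absolute weights -- pushes individual weights to exactly zero (sparsity).
    return lam * sum(abs(w) for w in weights)

def l2_penalty(weights, lam):
    # L2: sum of squared weights -- shrinks all weights smoothly toward zero.
    return lam * sum(w * w for w in weights)

def regularized_loss(preds, targets, weights, lam=0.01, kind="l2"):
    # Regularized objective = data-fit term + complexity penalty.
    penalty = l1_penalty(weights, lam) if kind == "l1" else l2_penalty(weights, lam)
    return mse(preds, targets) + penalty
```

Even when the model fits the data perfectly (zero MSE), large weights still raise the regularized loss, which is the mechanism that discourages overly complex fits.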
What Is a Generative Adversarial Network (GAN)? Explained in One Article

Generative Adversarial Network (GAN) is a deep learning model proposed by Ian Goodfellow et al. in 2014. The framework implements generative modeling by training two neural networks against each other...
1 month ago
0 · 12.4K
What Is Self-Attention? Explained in One Article

Self-Attention is a key mechanism in deep learning, originally proposed in the Transformer architecture and now widely used. The core idea is to let the model attend to all positions in the input sequence simultaneously, computing each position's representation as a weighted aggregation of...
1 month ago
0 · 14.9K
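The "attend to all positions, then aggregate with weights" idea above can be sketched in plain Python. This is a simplified version that uses the input vectors directly as queries, keys, and values; a real Transformer layer applies learned projection matrices W_q, W_k, W_v first:

```python
import math

def softmax(xs):
    # Numerically stable softmax: turns scores into weights that sum to 1.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(X):
    # X is a list of token vectors. Simplification: queries = keys = values = X.
    d = len(X[0])
    out = []
    for q in X:
        # Scaled dot-product score of this position against every position.
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in X]
        weights = softmax(scores)
        # Each output vector is a weighted aggregation of all positions' vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, X)) for j in range(d)])
    return out
```

Because every position's scores are computed against every other position, the whole sequence is processed with no recurrence, which is what enables the parallelism the Transformer is known for.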
What Is Multi-Task Learning (MTL)? Explained in One Article

Multi-Task Learning (MTL) is not an isolated algorithm but a machine learning paradigm.
1 month ago
0 · 11.2K
What Is a Diffusion Model? Explained in One Article

A Diffusion Model is a generative model specialized in creating new data samples such as images, audio, or text. Its core is inspired by diffusion in physics, which describes how particles naturally spread from regions of high concentration to regions of low concentration. In the machine...
2 months ago
0 · 14.2K
What Is Fine-Tuning? Explained in One Article

Fine-tuning is a specific implementation of transfer learning in machine learning. The core process starts from a pre-trained model, which has learned generic patterns and broad feature-extraction capabilities from a large-scale dataset. The fine-tuning phase then introduces a task-specific dataset to ...
2 months ago
0 · 11.8K
What Is the Attention Mechanism? Explained in One Article

The Attention Mechanism is a computational technique that mimics human cognitive processes; it was first applied in machine translation and later became an important component of deep learning.
2 months ago
0 · 15.7K
What Is the Transformer Architecture? Explained in One Article

The Transformer architecture is a deep learning model designed for sequence-to-sequence tasks such as machine translation and text summarization. Its core innovation is the complete reliance on self-attention mechanisms, eschewing traditional recurrent or convolutional structures and allowing the model to process all elements of a sequence in parallel, greatly...
2 months ago
0 · 16.4K