AI Knowledge Base

Total: 1,217 articles

What is Artificial Intelligence Governance (AI Governance) in one article?

AI governance is a comprehensive framework, spanning technology, ethics, law, and society, for effectively guiding, managing, and overseeing the entire lifecycle of AI systems, from design and development to deployment and end use. The core goal is not to hinder technological innovation, but to ensure that the development and application of AI technologies begin...
1 month ago
10.3K

What is Self-Supervised Learning (SSL) in one article?

Self-Supervised Learning (SSL) is an emerging learning paradigm in machine learning. Its core idea is to automatically generate supervision signals from unlabeled data and to train models to learn useful representations of that data. A minimal code sketch of the idea follows this entry.
1 month ago
9.8K
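As a concrete illustration of the excerpt above, here is a minimal, hypothetical sketch of a self-supervised pretext task in PyTorch: the supervision signal (a masked value the model must reconstruct) is derived from the unlabeled data itself. The toy data, the tiny MLP, and all hyperparameters are illustrative choices, not anything taken from the article.

```python
# Minimal self-supervised pretext task: reconstruct a masked value.
# The "labels" are carved out of the unlabeled data itself (illustrative sketch).
import torch
import torch.nn as nn

torch.manual_seed(0)
unlabeled = torch.randn(256, 16)                 # 256 unlabeled vectors, 16 features each

# Pretext task: hide one random position per sample and predict its value.
mask_idx = torch.randint(0, 16, (256,))
targets = unlabeled[torch.arange(256), mask_idx].clone()   # supervision derived from the data
corrupted = unlabeled.clone()
corrupted[torch.arange(256), mask_idx] = 0.0               # masked input shown to the model

model = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 16))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)

for step in range(200):
    pred = model(corrupted)                                 # predict every position
    loss = ((pred[torch.arange(256), mask_idx] - targets) ** 2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# No human-provided labels were used; the hidden layer now encodes a learned representation.
```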

What is Artificial Intelligence Safety (AI Safety) in one article?

Artificial Intelligence Safety (AI Safety) is a cutting-edge interdisciplinary field concerned with ensuring that AI systems, especially increasingly powerful and autonomous ones, behave reliably and predictably throughout their lifecycle, in accordance with human intent and without harmful consequences.
1 month ago
9.7K

What is a Recurrent Neural Network (RNN) in one article?

A Recurrent Neural Network (RNN) is a neural network architecture designed for processing sequential data, that is, data with temporal order or dependencies, such as natural-language text, speech signals, or time series. The recurrent update it applies at each step is sketched below this entry.
4 weeks ago
8.4K
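To make the recurrence concrete, here is a minimal NumPy sketch of the update an RNN applies at each time step, h_t = tanh(x_t W_x + h_{t-1} W_h + b); the sequence, shapes, and random weights are purely illustrative.

```python
# One pass of a vanilla RNN cell over a toy sequence (illustrative values only).
import numpy as np

rng = np.random.default_rng(0)
seq = rng.normal(size=(10, 4))        # a sequence of 10 time steps, 4 features each
W_x = 0.1 * rng.normal(size=(4, 8))   # input-to-hidden weights
W_h = 0.1 * rng.normal(size=(8, 8))   # hidden-to-hidden (recurrent) weights
b = np.zeros(8)

h = np.zeros(8)                       # hidden state carries context across steps
for x_t in seq:
    h = np.tanh(x_t @ W_x + h @ W_h + b)

print(h.shape)                        # (8,) -- final hidden state summarizing the sequence
```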

What is the Transformer Architecture in one article?

The Transformer architecture is a deep learning model designed for sequence-to-sequence tasks such as machine translation or text summarization. Its core innovation is the complete reliance on self-attention mechanisms, eschewing traditional recurrent or convolutional structures and allowing the model to process all elements of a sequence in parallel, large... A minimal self-attention sketch follows this entry.
3 weeks ago
6.9K
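The parallel, attention-based processing mentioned in the excerpt can be sketched in a few lines of NumPy: scaled dot-product self-attention lets every position attend to every other position via matrix products computed at once. Dimensions and random weights are illustrative, and multi-head projections, masking, and positional encodings are omitted.

```python
# Scaled dot-product self-attention over a toy sequence (single head, no mask).
import numpy as np

rng = np.random.default_rng(0)
seq_len, d_model = 6, 16
X = rng.normal(size=(seq_len, d_model))               # token embeddings

W_q, W_k, W_v = (0.1 * rng.normal(size=(d_model, d_model)) for _ in range(3))
Q, K, V = X @ W_q, X @ W_k, X @ W_v

scores = Q @ K.T / np.sqrt(d_model)                   # pairwise attention scores
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)        # softmax over each row
output = weights @ V                                  # all positions computed in parallel
print(output.shape)                                   # (6, 16)
```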

What is a Large Language Model (LLM) in one article?

A Large Language Model (LLM) is a deep learning system trained on massive amounts of text data, with the Transformer architecture at its core. The architecture's self-attention mechanism can effectively capture long-distance dependencies in language. The model's "large ...
3 weeks ago
5.4K

What is Fine-tuning in one article?

Model fine-tuning is a concrete application of transfer learning in machine learning. The process starts from a pre-trained model that has learned generic patterns and broad feature-extraction capabilities from large-scale datasets; the fine-tuning phase then introduces task-specific datasets to ... A minimal sketch of this workflow follows this entry.
2 weeks ago
4.7K
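A hypothetical sketch of that workflow, assuming PyTorch and torchvision's ImageNet-pretrained ResNet-18 as the pre-trained model: the generic feature extractor is frozen and only a new task-specific head is trained. The 5-class task, the fake batch, and the learning rate are placeholders, and loading the pretrained weights requires a network connection.

```python
# Fine-tune only a new classification head on top of a frozen pre-trained backbone.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)  # downloads ImageNet weights
for p in model.parameters():
    p.requires_grad = False                       # keep the generic features fixed

model.fc = nn.Linear(model.fc.in_features, 5)     # new head for a 5-class downstream task
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative update on a fake batch standing in for the task-specific dataset.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 5, (8,))
loss = criterion(model(images), labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Unfreezing some or all backbone layers with a smaller learning rate is the other common variant of the same idea.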

What is Regularization in one article?

Regularization is a core technique in machine learning and statistics for preventing model overfitting. It controls the degree of fitting by adding a penalty term, tied to the model's complexity, to the objective function. Common forms include L1 and L2 regularization: L1 produces sparse solutions and applies... A minimal comparison of the two follows this entry.
3 days ago
1.4K
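As a small illustration of the penalty-term idea, and of L1's sparsity versus L2's shrinkage, here is a scikit-learn sketch on synthetic data; the data and the penalty strength `alpha` are arbitrary choices.

```python
# Compare L1 (Lasso) and L2 (Ridge) penalties on a toy regression problem.
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
true_w = np.array([3.0, -2.0] + [0.0] * 8)        # only 2 of 10 features actually matter
y = X @ true_w + rng.normal(scale=0.1, size=100)

lasso = Lasso(alpha=0.1).fit(X, y)                # objective ~ squared error + alpha * ||w||_1
ridge = Ridge(alpha=0.1).fit(X, y)                # objective ~ squared error + alpha * ||w||_2^2

print("L1 (sparse):", np.round(lasso.coef_, 2))   # most coefficients driven to exactly 0
print("L2 (shrunk):", np.round(ridge.coef_, 2))   # small but generally non-zero coefficients
```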