[Math.] algorithmic
arithmetic; arithmetical
algorithm-level
– Algorithm level (algorithmic): implemented with high-level language constructs…
algorithm
[Computing] algorithmic language
[Computing] algorithmic state machine
algorithmic information theory
In mathematics and computer science, an algorithm (/ˈælɡərɪðəm/ AL-gə-ri-dhəm) is a self-contained, step-by-step set of operations to be performed. Algorithms exist that perform calculation, data processing, and automated reasoning.

An algorithm is an effective method expressed as a finite list of well-defined instructions for calculating a function. Starting from an initial state and initial input (perhaps empty), the instructions describe a computation that, when executed, proceeds through a finite number of well-defined successive states, eventually producing "output" and terminating at a final ending state. The transition from one state to the next is not necessarily deterministic; some algorithms, known as randomized algorithms, incorporate random input.

The concept of an algorithm has existed for centuries; however, a partial formalization of what would become the modern algorithm began with attempts to solve the Entscheidungsproblem (the "decision problem") posed by David Hilbert in 1928. Subsequent formalizations were framed as attempts to define "effective calculability" or "effective method"; those formalizations included the Gödel–Herbrand–Kleene recursive functions of 1930, 1934, and 1935, Alonzo Church's lambda calculus of 1936, Emil Post's "Formulation 1" of 1936, and Alan Turing's Turing machines of 1936–37 and 1939. Giving a formal definition of algorithms, corresponding to the intuitive notion, remains a challenging problem.
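As a concrete illustration of this state-transition view, here is a minimal Python sketch (not taken from the source text): Euclid's gcd algorithm stands in for a deterministic, terminating procedure, and a trivial random selection stands in for a randomized one. The function names gcd and randomized_pick are chosen for this example only.

```python
import random


def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a finite list of well-defined instructions.

    Starting from the initial state (a, b), each iteration moves to the
    successor state (b, a % b). The second component strictly decreases,
    so after finitely many states the loop terminates and the final state
    yields the output, the greatest common divisor.
    """
    while b != 0:
        a, b = b, a % b
    return a


def randomized_pick(items: list) -> object:
    """A trivial randomized algorithm: the transition to the next state
    depends on random input, so repeated runs on the same input may
    produce different outputs."""
    return items[random.randrange(len(items))]


if __name__ == "__main__":
    print(gcd(252, 105))               # deterministic: always prints 21
    print(randomized_pick([1, 2, 3]))  # randomized: prints 1, 2, or 3
```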