
Tag

Explore by tags

AIAny
Curated AI Resources for Everyone
support@aiany.app
Copyright © 2026 All Rights Reserved.
  • All

  • 30u30

  • ASR

  • ChatGPT

  • GNN

  • IDE

  • RAG

  • ai-agent

  • ai-api

  • ai-api-management

  • ai-client

  • ai-coding

  • ai-demos

  • ai-development

  • ai-framework

  • ai-image

  • ai-image-demos

  • ai-inference

  • ai-leaderboard

  • ai-library

  • ai-rank

  • ai-serving

  • ai-tools

  • ai-train

  • ai-video

  • ai-workflow

  • AIGC

  • alibaba

  • amazon

  • anthropic

  • audio

  • blog

  • book

  • bytedance

  • chatbot

  • chemistry

  • claude

  • claude-code

  • course

  • deepmind

  • deepseek

  • engineering

  • finance

  • foundation

  • foundation-model

  • gemini

  • github

  • google

  • gradient-boosting

  • grok

  • huggingface

  • LLM

  • llm

  • math

  • mcp

  • mcp-client

  • mcp-server

  • meta-ai

  • microsoft

  • mlops

  • NLP

  • nvidia

  • ocr

  • ollama

  • openai

  • paper

  • physics

  • plugin

  • pytorch

  • RL

  • robotics

  • science

  • security

  • sora

  • translation

  • tutorial

  • vibe-coding

  • video

  • vision

  • xAI

  • xai


NeuralOperator: Learning in Infinite Dimensions

2020
NeuralOperator (GitHub organization)

NeuralOperator is an open-source PyTorch library that implements neural operator architectures (notably Fourier Neural Operators) for learning mappings between function spaces. It targets physics-informed tasks such as PDE modeling and provides resolution-invariant operators, tensorized (Tucker) variants for parameter efficiency, and ready-to-use training utilities and examples.

pytorch · github · ai-library · physics · math +2
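The resolution invariance mentioned above comes from the core Fourier-layer trick: transform to frequency space, learn weights only on a fixed number of low modes, and transform back, so the same weights apply to a grid of any size. A minimal from-scratch sketch of one such spectral layer in NumPy (this is an illustration of the technique, not the neuralop library's actual API; all names here are made up):

```python
import numpy as np

rng = np.random.default_rng(0)

class SpectralConv1d:
    """One Fourier layer: FFT -> learned mixing of the lowest modes -> inverse FFT."""
    def __init__(self, channels, n_modes):
        self.n_modes = n_modes
        # Complex weights mix input channels into output channels, per retained mode.
        self.weight = (rng.standard_normal((channels, channels, n_modes))
                       + 1j * rng.standard_normal((channels, channels, n_modes))) / channels

    def __call__(self, x):                       # x: (channels, n_points)
        x_ft = np.fft.rfft(x, axis=-1)           # (channels, n_points//2 + 1)
        out_ft = np.zeros_like(x_ft)
        # Mix only the lowest n_modes frequencies; truncating the rest is what
        # lets the same weights act on inputs sampled at any resolution.
        out_ft[:, :self.n_modes] = np.einsum(
            "im,iom->om", x_ft[:, :self.n_modes], self.weight)
        return np.fft.irfft(out_ft, n=x.shape[-1], axis=-1)

layer = SpectralConv1d(channels=4, n_modes=8)
coarse = layer(rng.standard_normal((4, 64)))     # 64-point grid
fine = layer(rng.standard_normal((4, 256)))      # 256-point grid, same weights
```

Note that `coarse` and `fine` are produced by the identical parameter set, which is the sense in which the learned operator acts on functions rather than on fixed-size arrays.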

A Tutorial Introduction to the Minimum Description Length Principle

2004
Peter Grünwald

This paper gives a concise tutorial on the minimum description length (MDL) principle, unifying its intuitive and formal foundations; it has inspired widespread use of MDL in statistics and machine learning.

foundation · 30u30 · paper · math
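The core idea of MDL is that the best hypothesis is the one minimizing total code length: bits to describe the model plus bits to describe the data given the model. A toy sketch under assumed codings (a BIC-style log2(n) bits per parameter and a Gaussian code for residuals; these choices are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = np.linspace(-1.0, 1.0, n)
y = 2.0 * x + rng.normal(0.0, 0.1, n)          # data that is truly linear

def description_length(degree):
    """Two-part MDL code length L(H) + L(D|H), in bits."""
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    sigma2 = max(resid.var(), 1e-12)
    # L(D|H): Gaussian negative log-likelihood of the residuals, in bits.
    data_bits = 0.5 * n * np.log2(2.0 * np.pi * np.e * sigma2)
    # L(H): a crude log2(n) bits per real-valued parameter.
    model_bits = (degree + 1) * np.log2(n)
    return data_bits + model_bits

best_degree = min(range(6), key=description_length)
```

Higher-degree polynomials shrink the residual code only marginally on this data while paying a fixed price per extra coefficient, so the minimum lands on the true linear model rather than an overfit one.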

Kolmogorov Complexity and Algorithmic Randomness

2022
A. Shen, V. A. Uspensky +1

This book offers a comprehensive introduction to algorithmic information theory: it defines plain and prefix Kolmogorov complexity, explains the incompressibility method, relates complexity to Shannon information, and develops tests of randomness culminating in Martin-Löf randomness and Chaitin’s Ω. It surveys links to computability theory, mutual information, algorithmic statistics, Hausdorff dimension, ergodic theory, and data compression, providing numerous exercises and historical notes. By unifying complexity and randomness, it supplies rigorous tools for measuring information content, proving combinatorial lower bounds, and formalizing the notion of random infinite sequences, thus shaping modern theoretical computer science.

foundation · 30u30 · book · math
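Kolmogorov complexity itself is uncomputable, but any real compressor gives a computable upper bound on it, which is enough to illustrate the incompressibility idea the book develops: a string with a short description compresses well, while random data essentially does not. A small sketch using the standard-library zlib as the stand-in compressor:

```python
import os
import zlib

# len(zlib.compress(s)) is a computable upper bound on the Kolmogorov
# complexity of s (the true quantity is uncomputable).
structured = b"ab" * 5000           # 10,000 bytes with an obvious short description
random_bytes = os.urandom(10_000)   # incompressible with overwhelming probability

short_code = len(zlib.compress(structured, 9))   # tiny: the regularity is captured
long_code = len(zlib.compress(random_bytes, 9))  # near (or above) 10,000 bytes
```

This is the compression-based intuition behind the incompressibility method: since most strings of length n have no description much shorter than n, a "typical" object can be assumed incompressible in combinatorial lower-bound arguments.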