Matyas.

Neural Network

A neural network is a computational model inspired by the human brain, consisting of layers of interconnected nodes (neurons) that process data by adjusting weighted connections during training. Deep neural networks with many layers form the foundation of modern AI, powering everything from image recognition to language understanding. Common architectures include feedforward networks, convolutional networks (CNNs), and transformers.
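The forward pass of such a network can be sketched in a few lines. This is a minimal, illustrative example — a two-input, two-hidden-neuron, one-output feedforward network with sigmoid activations and hand-picked (untrained) weights — not a real trained model:

```python
import math

def sigmoid(x):
    # squashes any real number into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, w_hidden, w_out):
    # hidden layer: each neuron computes a weighted sum of the inputs,
    # then applies the activation function
    hidden = [sigmoid(sum(w * x for w, x in zip(weights, inputs)))
              for weights in w_hidden]
    # output layer: weighted sum of hidden activations, then activation
    return sigmoid(sum(w * h for w, h in zip(w_out, hidden)))

# toy network: weights are illustrative, not learned
w_hidden = [[0.5, -0.6], [0.3, 0.8]]
w_out = [1.2, -0.4]
y = forward([1.0, 0.0], w_hidden, w_out)
```

Training would adjust `w_hidden` and `w_out` via backpropagation; here they are fixed so the forward pass alone is visible.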

#ai

Related Terms

Chain of Thought

Chain of Thought (CoT) is a prompting technique that encourages an LLM to break down complex reasoning into intermediate steps before arriving at a final answer. By explicitly reasoning through each step, models achieve significantly better accuracy on math, logic, and multi-step problems. Extended thinking and "thinking" tokens in models like Claude represent a built-in form of chain-of-thought reasoning.
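In its simplest form, CoT is just prompt construction. The sketch below wraps a question in a step-by-step instruction before it would be sent to a model; the exact wording ("Let's think step by step") is one common phrasing, not a fixed API:

```python
def build_prompt(question, chain_of_thought=False):
    """Wrap a question in a prompt, optionally eliciting step-by-step reasoning."""
    if chain_of_thought:
        return (f"{question}\n"
                "Let's think step by step before giving the final answer.")
    return question

plain = build_prompt("What is 17 * 23?")
cot = build_prompt("What is 17 * 23?", chain_of_thought=True)
```

With the CoT variant, the model is nudged to emit intermediate arithmetic before the final answer, which is where the accuracy gains come from.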

Hallucination

In AI, hallucination occurs when a language model generates confident-sounding but factually incorrect or fabricated information. This happens because LLMs predict statistically likely text rather than retrieving verified facts. Mitigation strategies include RAG, grounding responses in source documents, structured output validation, and lowering the temperature setting to reduce creative deviation.
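One of the mitigation ideas above — grounding responses in source documents — can be illustrated with a deliberately crude check: does most of a generated sentence's vocabulary actually appear in the source text? Real systems use entailment models or citation verification; this lexical-overlap version is only a sketch:

```python
def is_grounded(answer_sentence, source_text, threshold=0.5):
    # crude lexical check: what fraction of the answer's words
    # also appear somewhere in the source document?
    answer_words = set(answer_sentence.lower().split())
    source_words = set(source_text.lower().split())
    if not answer_words:
        return False
    overlap = len(answer_words & source_words) / len(answer_words)
    return overlap >= threshold

source = "the eiffel tower is in paris france"
grounded = is_grounded("the tower is in paris", source)
ungrounded = is_grounded("the tower was built in 1800 by aliens", source)
```

Sentences that fail the check could be flagged for review or regenerated with stricter grounding instructions.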

Computer Vision

Computer vision is a field of AI that trains machines to interpret and understand visual information from images and videos. Applications include object detection, facial recognition, autonomous driving, and medical image analysis. Modern computer vision leverages deep learning models like CNNs and vision transformers (ViT), and increasingly integrates with language models in multimodal AI systems.
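The convolution operation at the heart of CNNs is simple enough to show directly. This pure-Python sketch slides a small kernel over a grayscale image (no padding, stride 1); the example kernel is a basic vertical-edge detector, chosen here for illustration:

```python
def convolve2d(image, kernel):
    # "valid" convolution: slide the kernel over every position
    # where it fits entirely inside the image
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

# image with a dark left half and bright right half
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
edges = convolve2d(image, [[-1, 1]])  # responds where brightness jumps
```

A CNN stacks many such filters, learning the kernel values during training instead of hand-picking them.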

Generative AI

Generative AI is a category of artificial intelligence that creates new content — text, images, code, music, or video — rather than just analyzing or classifying existing data. Powered by architectures like transformers and diffusion models, generative AI has transformed software development with tools like GitHub Copilot, Claude, and Cursor. It represents a shift from AI as a classification tool to AI as a creative collaborator.

Context Window

A context window is the maximum amount of text (measured in tokens) that an LLM can process in a single interaction, encompassing both the input prompt and the generated output. Larger context windows allow models to handle longer documents, maintain extended conversations, and reason over more information at once. Context window sizes have grown rapidly, from 4K tokens in early GPT models to hundreds of thousands, and in some cases up to 1M, in current models like Claude.
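Because the window covers both input and output, applications often budget tokens before sending a request. The sketch below uses a rough rule of thumb (~4 characters per English token) rather than a real tokenizer, so treat the estimate as an assumption:

```python
def estimate_tokens(text):
    # rough heuristic: roughly 4 characters per token for English text;
    # a real application would use the model's own tokenizer
    return max(1, len(text) // 4)

def fits_context(prompt, max_output_tokens, context_window):
    # the prompt plus the reserved output budget must both fit
    return estimate_tokens(prompt) + max_output_tokens <= context_window

short_ok = fits_context("a" * 4000, max_output_tokens=1000, context_window=4096)
long_bad = fits_context("a" * 20000, max_output_tokens=1000, context_window=4096)
```

When the check fails, typical remedies are truncating or summarizing the prompt, or chunking the document across multiple calls.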

RAG

Retrieval-Augmented Generation (RAG) is a technique that enhances LLM responses by retrieving relevant documents from an external knowledge base before generating an answer. This allows the model to ground its output in up-to-date, domain-specific information rather than relying solely on its training data. RAG is widely used in enterprise chatbots, documentation assistants, and search-powered AI applications.
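The retrieve-then-generate flow can be sketched end to end. Production RAG ranks documents by embedding similarity in a vector store; this minimal version substitutes simple word overlap as the scoring function, and the prompt wording is just one illustrative template:

```python
def retrieve(query, documents, k=1):
    # score each document by word overlap with the query -- a stand-in
    # for the embedding similarity a real RAG system would use
    q = set(query.lower().split())
    scored = sorted(documents,
                    key=lambda d: len(q & set(d.lower().split())),
                    reverse=True)
    return scored[:k]

def rag_prompt(query, documents):
    # ground the model by prepending the retrieved context to the question
    context = "\n".join(retrieve(query, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

documents = [
    "Returns are accepted within 30 days of purchase.",
    "Shipping is free on orders over 50 dollars.",
    "Our office is closed on public holidays.",
]
prompt = rag_prompt("are returns accepted", documents)
```

The resulting prompt grounds the model in the retrieved policy text instead of leaving it to answer from training data alone.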

© 2026 Matyas Prochazka. All rights reserved.