Wednesday, January 7, 2026

BYD vs. Tesla EV Production Comparison

BYD vs. Tesla: 2025 Electric Vehicle Production & Sales

Yes, BYD has surpassed Tesla in pure electric vehicle (BEV) production and sales for the 2025 calendar year. This marks the first full year BYD has taken the top spot from Tesla, which had been the leader for years.

2025 Annual Performance & Market Position

🏆 BYD (2025)
  • Pure Electric (BEV) Sales: 2.26 million
  • Total Vehicle Sales (BEV + PHEV): 4.55 million
  • Key Market: Global leader; strong growth in Europe
  • Current Position: World's largest EV manufacturer

🔋 Tesla (2025)
  • Pure Electric (BEV) Sales: 1.64 million
  • Total Vehicle Sales: 1.64 million (Tesla sells only BEVs, no PHEVs)
  • Key Market: Global; facing challenges in core markets
  • Current Position: Previously the world's largest
🔍 Key Factors Behind the Shift

The change in leadership stems from the two companies' diverging trajectories in 2025:

BYD's Global Growth
BYD achieved a 27.9% year-on-year increase in BEV sales.
A major driver was its overseas success, with exports surging by 150.7% to over 1 million vehicles.
Despite a challenging market in China, BYD's global expansion, including new factories in places like Hungary, helped secure the top spot.
Tesla's Sales Decline
Tesla's annual sales fell by approximately 9% in 2025.
Political and Brand Factors: Elon Musk's political activities and alignment with the Trump administration are cited as having alienated some customers and negatively impacted the brand. A Yale University study suggested this could have significantly reduced Tesla's sales potential.
Policy Changes: The withdrawal of U.S. federal EV subsidies under the Trump administration hurt demand.
Product Transition: Tesla's production was affected by the ramp-down of the old Model Y and the ramp-up of its successor, making the popular model unavailable for months.
📊 A Look at the Broader Picture
Defining the "Largest": When comparing, the term "largest EV maker" typically refers only to Battery Electric Vehicles (BEVs). If you include Plug-in Hybrid Electric Vehicles (PHEVs) in the count, BYD has been the overall "New Energy Vehicle" leader for several years due to its strong hybrid lineup.
Profit vs. Volume: Despite selling fewer cars, Tesla has historically been by far the more profitable company. One industry report notes that Tesla still makes the vast majority of global EV profits, which funds its other ventures like energy storage and robotics.
Looking Ahead: For 2026, BYD aims to sell 1.6 million vehicles outside China, while Tesla's sales are forecast to recover slightly to around 1.75 million. However, analysts widely expect the intense competition between these two, and with other global automakers, to continue.
Data reflects 2025 calendar year production and sales figures. The comparison focuses on Battery Electric Vehicle (BEV) volumes for the "world's largest EV maker" title.
Right Triangle Sides Explained

Understanding Hypotenuse, Adjacent, and Opposite Sides

Important: Adjacent and opposite are sides of a right triangle, defined relative to a specific acute angle. The hypotenuse is fixed.

1. The Hypotenuse

Definition
The longest side of a right triangle.
Location
It is always the side opposite the right angle (90° angle).
Key Fact
It never changes for a given triangle and is always the hypotenuse, no matter which acute angle you're using as your reference.

2. Adjacent Side (Relative to a chosen angle)

Definition
The leg that forms the chosen acute angle, along with the hypotenuse.
Memory Aid
The side touching or next to the angle (other than the hypotenuse).

3. Opposite Side (Relative to a chosen angle)

Definition
The leg that is across from the chosen acute angle. It does not form the angle.
Memory Aid
The side facing the angle.

Visual Explanation

View from Angle θ (Theta)

  • Hypotenuse: The slanted side (always)
  • Adjacent (to θ): The bottom horizontal leg (touching θ)
  • Opposite (to θ): The vertical leg (across from θ)

View from Angle α (Alpha)

  • Hypotenuse: The same slanted side (unchanged)
  • Adjacent (to α): The vertical leg (now touching α)
  • Opposite (to α): The bottom horizontal leg (now across from α)
Notice: The Opposite side for θ is the Adjacent side for α, and vice-versa. The sides swap roles when you change reference angles!

Connection to Trigonometry

This naming convention is the foundation of the three primary trigonometric ratios:

  • Sine (sin): Opposite / Hypotenuse. Compares the side opposite the angle to the hypotenuse.
  • Cosine (cos): Adjacent / Hypotenuse. Compares the side adjacent to the angle to the hypotenuse.
  • Tangent (tan): Opposite / Adjacent. Compares the side opposite the angle to the side adjacent to it.

Example: In the first triangle above, for angle θ:

  • sin θ = (Opposite to θ) / Hypotenuse
  • cos θ = (Adjacent to θ) / Hypotenuse
  • tan θ = (Opposite to θ) / (Adjacent to θ)
SOH-CAH-TOA
(The classic mnemonic for remembering trigonometric ratios)
SOH
Sine = Opposite / Hypotenuse
CAH
Cosine = Adjacent / Hypotenuse
TOA
Tangent = Opposite / Adjacent

Key Takeaway

Always ask: "Which acute angle am I using as my reference point?" Once you pick the angle:

  • Hypotenuse is fixed (opposite the right angle).
  • Opposite is the side directly across from your chosen angle.
  • Adjacent is the side next to your angle that isn't the hypotenuse.
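
These rules can be checked numerically. Below is a minimal Python sketch using a 3-4-5 right triangle; the variable names are illustrative, and the final assertions confirm that the two acute angles swap the opposite and adjacent sides.

```python
import math

# Right triangle with legs 3 and 4; the hypotenuse follows from Pythagoras.
opposite = 3.0    # side opposite the chosen angle theta
adjacent = 4.0    # the non-hypotenuse leg touching theta
hypotenuse = math.hypot(opposite, adjacent)  # 5.0

# SOH-CAH-TOA ratios for theta
sin_theta = opposite / hypotenuse   # 0.6
cos_theta = adjacent / hypotenuse   # 0.8
tan_theta = opposite / adjacent     # 0.75

# Recover theta and confirm the ratios match math.sin/cos/tan
theta = math.atan2(opposite, adjacent)
assert math.isclose(math.sin(theta), sin_theta)
assert math.isclose(math.cos(theta), cos_theta)
assert math.isclose(math.tan(theta), tan_theta)

# Switching to the other acute angle (alpha) swaps opposite and adjacent:
alpha = math.pi / 2 - theta
assert math.isclose(math.sin(alpha), cos_theta)  # alpha's opposite = theta's adjacent
assert math.isclose(math.cos(alpha), sin_theta)  # alpha's adjacent = theta's opposite
```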


Tuesday, January 6, 2026

Radians vs. Degrees

What is More Important: Radians or Degrees?

Radians are fundamentally more important for mathematics and physics, while degrees are more intuitive for everyday life.

Think of it this way: Radians are the "native language" of angles, built into the very structure of math. Degrees are a convenient, human-made translation.

The Case for Radians (Why They Are More Important)

Natural Connection to Circles

One radian is defined as the angle created when you take the radius of a circle and wrap it along the circumference. The formula for arc length becomes beautifully simple: Arc Length = Radius × Angle (in radians), or s = rθ. This formula doesn't work cleanly with degrees without a conversion factor.
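
As a quick check of s = rθ, here is a small Python sketch (the radius and angle values are arbitrary); it also shows the π/180 conversion factor that degrees drag into the formula.

```python
import math

radius = 2.0
angle_deg = 90.0

# In radians, arc length is simply s = r * theta.
theta = math.radians(angle_deg)   # pi/2
arc_length = radius * theta       # 2 * (pi/2) = pi

# With degrees, a conversion factor of pi/180 must be baked in:
arc_length_from_degrees = radius * angle_deg * math.pi / 180.0
assert math.isclose(arc_length, arc_length_from_degrees)

# Sanity check: a quarter turn of a circle with r = 2 is a quarter
# of the circumference 2*pi*r = 4*pi, i.e. pi.
assert math.isclose(arc_length, math.pi)
```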

Calculus & Higher Math Works Beautifully

This is the most critical reason. The derivative of sin(x) is cos(x) only if x is in radians. Taylor series expansions and other advanced mathematical tools only work naturally when angles are measured in radians. They are the "natural unit" that makes the math of waves, oscillations, and growth clean and elegant.
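
This can be seen numerically with a central-difference derivative; the sketch below uses only the standard library, and the helper name is illustrative.

```python
import math

def numerical_derivative(f, x, h=1e-6):
    """Central-difference estimate of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

def sin_deg(d):
    """sin of an angle given in degrees."""
    return math.sin(math.radians(d))

x = 1.0  # radians

# In radians: d/dx sin(x) = cos(x), with no extra factor.
assert math.isclose(numerical_derivative(math.sin, x), math.cos(x), rel_tol=1e-6)

# If the angle is measured in degrees, the chain rule introduces a
# factor of pi/180, so the derivative is NOT simply cos(x degrees):
deriv = numerical_derivative(sin_deg, 30.0)
assert math.isclose(deriv, math.cos(math.radians(30.0)) * math.pi / 180.0,
                    rel_tol=1e-6)
```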

They Are Unitless

A radian is a ratio of two lengths (arc length / radius), so it has no dimension. This lets it slot seamlessly into physics formulas, such as angular velocity (ω = θ/t).
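
A short sketch of why this matters (the numbers are arbitrary): because radians are a pure ratio, linear speed follows from angular velocity with no conversion constant.

```python
import math

# Angular velocity omega = theta / t needs no unit factor,
# because a radian is a pure ratio (arc length / radius).
theta = math.pi      # half a revolution, in radians
t = 2.0              # seconds
omega = theta / t    # angular velocity, rad/s

# Linear speed at radius r is just v = omega * r, again with no
# conversion constant sneaking in.
r = 0.5              # meters
v = omega * r        # meters per second
```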

Universal in Science and Engineering

Advanced fields such as physics, engineering, and computer graphics work almost exclusively in radians. To understand signal processing, orbital mechanics, or quantum physics, you must use radians.

The Case for Degrees (Why They Persist)

Human Intuition

The 360-degree system is highly divisible (360 has divisors 2, 3, 4, 5, 6, 8, 9, 10, 12...), which is excellent for mental estimation and simple geometry. A right angle (90°) is easy to visualize and communicate.

Historical & Cultural Pervasiveness

Degrees have been used for millennia in navigation, construction, and basic geography. They are the first unit of angle measurement most people learn.

Practical for Simple Tasks

For reading a clock face (a full circle is 360°), reading a compass (a bearing of 45°), or cutting a pie, degrees are perfectly adequate and intuitive.

Analogy: Temperature

Degrees are like Fahrenheit or Celsius – practical for everyday use ("it's 70°F outside").
Radians are like Kelvin – the absolute, scientific scale where fundamental physical laws work simply and directly.

The Verdict

For calculation, theory, and advanced STEM fields: Radians are unquestionably more important. They are the correct and natural unit.

For communication, basic geometry, and everyday life: Degrees are more common and intuitive.

How to think about it: You need to be bilingual. Learn to think in both, but understand that radians are the language in which the universe's mathematical laws are most simply written. When in doubt in a technical or mathematical context, use radians.

Saturday, January 3, 2026

Systems Framework: Natural Epistemology to AI

Systems Framework: From Natural Epistemology to Artificial Intelligence

A structured architecture mapping human cognition to AI systems through properties, parameters, and attributes.

Core Philosophical Framework: From Natural to Artificial

Natural Epistemology (Human): Senses + Intelligence + Objects → Perception & Knowledge
Artificial Intelligence (System): Sensors + Algorithms + Data → Models & Actions
Key Insight: Properties, Parameters, and Attributes serve as the formal, quantifiable representations of the qualities perceived by senses and reasoned about by intelligence.

Systems Architecture: A Three-Layer Model

This model maps directly onto the hardware/firmware/software paradigm, creating a coherent information pipeline.

Layer 1: Hardware Layer (The "Body" & Raw Interface)

Correlate to:

The Five Senses + Physical Objects.

Role:

Transduces physical phenomena (objects, events) into structured digital data.

Incorporating Properties, Parameters, Attributes:

Attributes (Intrinsic to Objects/Signals): These are the raw, measurable qualities of the physical world.

Example (Vision): Pixel luminance (brightness), wavelength (color), spatial coordinates.

Example (Audio): Frequency, amplitude, phase.

Example (Touch Sensor): Resistance, capacitance, pressure (psi).

System Format: These are low-dimensional, physically-grounded data vectors from sensors. They are the atomic primitives of the system's perception. A parameter here might be the sampling rate (firmware-defined) that governs how these attributes are captured.
Layer 2: Firmware / Middleware Layer (The "Perceptual Spine")

Correlate to:

Lower-level, quasi-reflexive perception and signal processing (the "hardwired" parts of intelligence).

Role:

Converts low-level attributes into higher-level properties and features. This layer performs invariant detection and filtering.

Incorporating Properties, Parameters, Attributes:

Properties (Derived & Relational): These are computed interpretations of combined attributes.

Example: From pixel attributes (color, brightness), compute the property texture = {"rough", "smooth"} or edge_strength = 0.87.

Example: From audio attributes (frequencies), compute the property pitch = 440Hz or phoneme = "/ae/".

Parameters (The Tunable Knobs): This layer is parameter-heavy. These are the fixed or tunable settings that control how attributes are synthesized into properties.

Examples: Edge detection kernel coefficients, filter cut-off frequencies, time-window sizes, noise-floor thresholds, activation functions in a neural net layer.

System Format: A pipeline of parameterized transforms (e.g., DSP filters, convolutional kernels, spectral analyzers). Its output is a feature vector—a structured set of properties ready for cognitive software.
Layer 3: Software / Cognitive Layer (The "Mind")

Correlate to:

Higher-order Intelligence + Synthesis.

Role:

Uses properties and features from the firmware layer to form abstract representations, make decisions, learn, and act. This is where epistemology becomes explicit.

Incorporating Properties, Parameters, Attributes:

Attributes (in the software sense): Now become symbolic or semantic labels attached to conceptual objects.

Example: An object in a knowledge graph has attributes: {"type": "cat", "size": "medium", "affectionate": True}. These are high-level assertions.

Properties are used as evidence to assign these attributes via classification (if "furry" and "meows" then type:cat).

Parameters (The Learned & Adaptive Core): These are the learnable weights of models (e.g., weights in a Deep Neural Network, probabilities in a Bayesian network, rule weights in an expert system).

These parameters encode the system's epistemology—its "beliefs" about how sensory properties correlate with conceptual attributes and categories. They are updated via learning algorithms.

System Format: Models (e.g., neural networks, probabilistic graphs, symbolic KBs) defined by:
1. Architecture/Logic (the fixed structure of reasoning).
2. Parameters (the malleable knowledge within that structure).
3. Input/Output Schemas (mapping perceptual properties to cognitive attributes and actions).
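
To make the three-layer flow concrete, here is a deliberately tiny Python sketch. Every name, threshold, and weight is illustrative, standing in for real sensor drivers, parameterized signal-processing stages, and learned models.

```python
# Layer 1 (hardware): raw signal attributes from a hypothetical sensor.
raw_attributes = {"brightness": 0.9, "edge_energy": 0.82, "motion": 0.1}

# Layer 2 (firmware): tunable parameters control how attributes
# are synthesized into higher-level properties.
firmware_params = {"edge_threshold": 0.5, "brightness_threshold": 0.7}

def extract_properties(attrs, params):
    """Compute derived, relational properties from raw attributes."""
    return {
        "has_strong_edges": attrs["edge_energy"] > params["edge_threshold"],
        "is_bright": attrs["brightness"] > params["brightness_threshold"],
    }

# Layer 3 (software): learned parameters (here a toy rule weight)
# map perceptual properties to semantic attributes.
model_params = {"object_if_edges_weight": 1.0}

def classify(props, params):
    """Assign a semantic attribute using the model's parameters."""
    score = params["object_if_edges_weight"] * props["has_strong_edges"]
    return {"type": "object" if score >= 0.5 else "background"}

properties = extract_properties(raw_attributes, firmware_params)
semantic_attributes = classify(properties, model_params)
```

In a real system the firmware layer would be a pipeline of DSP filters or convolutional kernels, and the classifier's parameters would be learned rather than hand-set; the structure of the pipeline is the point.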
Unified Systems View: The Information Flow
Each row maps a natural epistemology component to its AI system counterpart and its form of representation (properties, parameters, attributes):

  • Object in the World → Data Source / Target: has physical attributes (mass, reflectivity, etc.).
  • Sense Organ (e.g., Eye) → Hardware Sensor (e.g., Camera): outputs signal attributes (pixel arrays), governed by physical parameters (exposure, gain).
  • Perceptual Processing → Firmware/Middleware Layer: transforms signal attributes into perceptual properties (edges, textures) using algorithmic parameters (filter coefficients).
  • Intelligence (Understanding) → Software/Cognitive Layer: maps properties to semantic attributes (identity, intent, risk) using model parameters (neural weights) refined by learning.
  • Knowledge → Internal Model State: a structured network where entities have attributes and relations have properties; the model's parameters are the encoded knowledge.
  • Action / Expression → Actuators & Outputs: commands defined by control parameters, which are functions of the system's state (attributes + properties).

Pedagogical Value of This Framework

Demystifies AI

It shows AI not as magic, but as a systematic engineering implementation of the natural process of knowing.

Clarifies Terminology

Attribute: A qualifier. Can be low-level (sensor data) or high-level (semantic label).

Property: A descriptive characteristic derived from relationships or computations. It often sits between raw data and abstract knowledge.

Parameter: A system variable that controls a process. It can be fixed (design choice), tunable (knob), or learned (the essence of AI).

Emphasizes the Pipeline

Students see that intelligence is built on a layered transformation of representations, each with its own type of parameters and attributes.

Unifies Symbolic and Sub-Symbolic AI

High-level symbolic attributes (e.g., dangerous) can be grounded in sub-symbolic properties (e.g., rapid_looming_motion = true) via parameterized models.

In essence: By adopting this systems format, you teach that building an AI is the process of designing a pipeline that transforms physical attributes into cognitive attributes, mediated by parameters that are either engineered or learned. This perfectly captures the transition from natural epistemology to artificial intelligence.


NASA's Dragonfly Mission to Titan


🚀 ACTIVE DEVELOPMENT | LAUNCH: 2028 | ARRIVAL: 2034

🚀 Mission Overview & Status

Primary Objective Investigate Titan's prebiotic chemistry and habitability to understand the origins of life.
Launch Window Planned for July 5-25, 2028.
Titan Arrival Scheduled for 2034.
Mission Duration 3.3 years of surface operations.
Current Status In development and testing; passed its Critical Design Review in April 2025, authorizing full-scale construction.

🛰️ The Spacecraft: An Innovative Titan Flyer

Dragonfly is an octocopter—a rotorcraft with eight rotors, roughly the size of a small car. It leverages Titan's unique environment:

Key Environmental Advantages
Dense Atmosphere Four times denser than Earth's, making flight efficient.
Low Gravity About 1/7th of Earth's, reducing the power needed to fly.
Power Source A Multi-Mission Radioisotope Thermoelectric Generator (MMRTG). This nuclear battery recharges Dragonfly's lithium-ion batteries during the 8-Earth-day-long Titan night.

🪐 Why Titan? A World of Prebiotic Chemistry

Titan is a high-priority target for astrobiology because it resembles a frozen version of early Earth.

Organic Rich Its atmosphere and surface are filled with complex, carbon-rich molecules—the building blocks of life.
Active "Hydrological" Cycle Features clouds, rain, rivers, and lakes of liquid methane and ethane, similar to Earth's water cycle.
Subsurface Ocean Evidence suggests a global saltwater ocean beneath its icy crust, a potential habitat.

🔬 Scientific Goals & Instruments

Dragonfly will fly to dozens of distinct locations across Titan to sample and analyze surface materials. Its instruments will:

Analyze Surface Composition Use drills and a mass spectrometer (DraMS) to identify organic molecules.
Probe Beneath the Surface A gamma-ray and neutron spectrometer (DraGNS) will detect subsurface elements.
Monitor Environment A geophysics and meteorology package (DraGMet), including a seismometer provided by JAXA, will measure weather and "Titanquakes".
Scout & Image A suite of cameras (DragonCam) will capture aerial and microscopic images.

🗺️ The Flight Plan: An Epic Journey

Dragonfly will embark on an ambitious aerial expedition:

Initial Landing Touchdown in the Shangri-La dune fields, similar to linear dunes on Earth.
"Leapfrog" Exploration It will perform short flights, eventually building up to journeys of up to 5 miles (8 km) per hop.
Final Destination The mission aims to reach the Selk impact crater, where past liquid water likely mixed with organic material.
Total Travel Over its mission, Dragonfly is expected to fly more than 108 miles (175 km), vastly exceeding the range of Mars rovers.

⚙️ Recent Progress & Challenges

The mission is making tangible progress but has faced hurdles:

Recent Milestones Successful rotor testing in Titan-like conditions, delivery of flight radios, and fabrication of the protective aeroshell keep the mission on track for a 2028 launch.
Management Notes A NASA Office of Inspector General report noted the mission's launch delay from 2026 to 2028 and a significant cost increase, partly due to supply chain issues and the pandemic.

🏗️ A Global Collaborative Effort

The mission is led by the Johns Hopkins University Applied Physics Laboratory (APL) for NASA and involves an international team:

Principal Investigator Dr. Elizabeth "Zibi" Turtle.
Key Partners: NASA Goddard, Lockheed Martin, NASA Ames, and others.
International Contributors:
JAXA (Japan)
CNES (France)
DLR (Germany)

In short, Dragonfly is a groundbreaking mission that will use revolutionary technology to explore one of the most Earth-like and chemically rich worlds in our solar system, seeking clues to how life begins.

Friday, January 2, 2026

Top 3 AI Achievements Forecast for 2026

AI Forecast 2026

The Top Three Anticipated Achievements in Artificial Intelligence

Based on expert predictions for 2026, the top three AI achievements will revolve around AI becoming a true collaborative partner, transforming from a tool into a proactive teammate in both professional and scientific work.

In summary, 2026 is expected to be less about flashy new models and more about AI integration—transforming how we work, discover new knowledge, and build technology through practical, collaborative partnerships.

The Rise of AI 'Coworkers' and Agents

This trend marks a shift from AI that responds to questions to systems that can take initiative and complete multi-step tasks autonomously.

These "digital colleagues" are predicted to become common in workplaces, managing workflows and complex projects with minimal human input.


AI Transforming Scientific and Medical Discovery

AI will evolve from a research aid to an active participant in the scientific process, capable of generating hypotheses and even controlling parts of experiments.

In medicine, 2026 is expected to bring a "ChatGPT moment", with powerful new foundation models that could diagnose rare diseases and move AI tools from trials into standard clinical practice.


Sophisticated, Context-Aware AI for Developers

AI coding assistants will gain "repository intelligence", understanding the full context, history, and relationships within a codebase.

This allows them to make smarter architectural suggestions and automate fixes, fundamentally accelerating software development.


📈 What These Changes Mean

For Work

Routine and administrative tasks will increasingly be offloaded to AI agents, changing job roles and requiring new skills for managing AI collaboration.

For Business

Success will depend on integrating these AI agents into real workflows to achieve measurable results, moving beyond experimental pilots.

Broader Challenges

The year will also focus on evaluating AI's real impact, managing the risks of AI-generated synthetic content, and addressing the significant energy demands of AI infrastructure.

Forecast based on expert predictions for AI development in 2026.


Actors and Patrons in the Yemen Conflict


A complex web of local factions, regional powers, and international actors

Conflict Overview

The Yemen conflict involves a complex mix of local warring factions, regional powers backing different sides, and influential international actors. The situation is fluid, with recent military actions in late 2025 shifting control in southern Yemen.

Main Yemeni Factions & Their Patrons

The Houthis (Ansar Allah)
Primary Patron: Iran
Support: Weapons, training, ideological support (since ~2009)
Control: Northern Yemen (Capital Sana'a and northwest)
Presidential Leadership Council (PLC)
Primary Patron: Saudi Arabia
Role: Internationally recognized government; formed in 2022 to unify anti-Houthi forces
Control: Fragmented territories, including Marib and Taiz
Southern Transitional Council (STC)
Primary Patron: United Arab Emirates (UAE)
Role: Separatist group aiming to restore an independent southern state
Control: Southern Yemen (including Aden and 8 governorates)
Al-Qaeda & Islamic State
Primary Patron: Largely Autonomous
Role: Terrorist groups exploiting the conflict
Control: Hinterlands and some coastal areas

Key Regional & International Actors

Saudi Arabia
Primary Role
Has led the military coalition against the Houthis since 2015; primary patron of the PLC
United Arab Emirates (UAE)
Primary Role
Key coalition partner; shifted to backing the STC and local militias
Iran
Primary Role
Primary backer of the Houthis, providing weapons, training, and ideological guidance
United States & United Kingdom
Primary Role
Provided support to the Saudi coalition; US has scaled back some support
United Nations
Primary Role
Brokers ceasefires and humanitarian efforts; passed key resolutions like the arms embargo

Key Points to Understand the Conflict's Nature

  • Multilayered Conflict: The Yemen war is not a simple two-sided fight but involves multiple groups with shifting alliances.
  • Temporary Alliances: Groups frequently unite based on a "common adversary" rather than shared goals, leading to unstable coalitions.
  • Patron-Client Complexities: External patrons like Iran or the UAE provide support, but their Yemeni allies fiercely guard their autonomy.
  • Internal Divisions: The PLC and STC are allies against the Houthis but have fought each other for control of the south.

Note: The situation in Yemen is highly fluid, with control of territories and alliances subject to change. The information above represents a snapshot of the main actors based on recent reporting up to early 2026. For the very latest developments, consult current news sources and official UN reports.
