
https://magicmarcy.de/code-smells-was-riecht-denn-hier-so-streng

#CodeSmells #Methoden #Logik #Parameter #MagicStrings #MagicNumbers #Programming #Awareness

Code smells - what's that strong smell? | magicmarcy.de

Code smells are signs that something in the code isn't clean - that something smells bad. This isn't about syntax errors or bugs, but about structures that slow you down in the long run. The code may work today, but it becomes harder to understand, test, and extend. Especially in Java projects, such spots accumulate quickly if you don't consciously notice them.

magicmarcy.de

LLMs contain a LOT of parameters. But what’s a parameter? – MIT Technology Review

Artificial intelligence

LLMs contain a LOT of parameters. But what’s a parameter?

They’re the mysterious numbers that make your favorite AI models tick. What are they and what do they do?

By Will Douglas Heaven

January 7, 2026

Photo Illustration by Sarah Rogers/MITTR | Photos Getty

MIT Technology Review Explains: Let our writers untangle the complex, messy world of technology to help you understand what’s coming next. You can read more from the series here.

I am writing this because one of my editors woke up in the middle of the night and scribbled on a bedside notepad: “What is a parameter?” Unlike a lot of thoughts that hit at 4 a.m., it’s a really good question—one that goes right to the heart of how large language models work. And I’m not just saying that because he’s my boss. (Hi, Boss!)

A large language model’s parameters are often said to be the dials and levers that control how it behaves. Think of a planet-size pinball machine that sends its balls pinging from one end to the other via billions of paddles and bumpers set just so. Tweak those settings and the balls will behave in a different way.  

OpenAI’s GPT-3, released in 2020, had 175 billion parameters. Google DeepMind’s latest LLM, Gemini 3, may have at least a trillion—some think it’s probably more like 7 trillion—but the company isn’t saying. (With competition now fierce, AI firms no longer share information about how their models are built.)

But the basics of what parameters are and how they make LLMs do the remarkable things that they do are the same across different models. Ever wondered what makes an LLM really tick—what’s behind the colorful pinball-machine metaphors? Let’s dive in.  

What is a parameter?

Think back to middle school algebra, like 2a + b. Those letters are parameters: Assign them values and you get a result. In math or coding, parameters are used to set limits or determine output. The parameters inside LLMs work in a similar way, just on a mind-boggling scale. 
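To make that concrete, here is a toy sketch (my own illustration, not from the article): a "model" with just two parameters, mirroring the 2a + b example. Real LLMs work the same way in principle, only with billions of these learned numbers.

```python
def tiny_model(x, w=2.0, b=1.0):
    """A 'model' with exactly two parameters, w and b.

    The parameters are fixed numbers baked into the model that
    determine how an input x is mapped to an output -- the same
    role the billions of weights inside an LLM play.
    """
    return w * x + b

# With the default parameters this is just "2a + b" from the text:
print(tiny_model(3.0))                # 2*3 + 1 = 7.0

# "Tweak the dials" -- change the parameters -- and the same input
# produces a different output, like re-setting the pinball bumpers:
print(tiny_model(3.0, w=3.0, b=0.0))  # 3*3 + 0 = 9.0
```

Training a real model is essentially the process of nudging those w- and b-style numbers, automatically, until the outputs look right.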

Continue reading the original article: LLMs contain a LOT of parameters. But what’s a parameter? | MIT Technology Review

#Billions #DeepMind #Gemini3 #Google #LargeLanguageModels #LLMs #LotsOfParameters #MITTechnologyReview #Parameter #Trillions
Everybody who wants to understand Generative AI should read articles like this.
https://www.technologyreview.com/2026/01/07/1130795/what-even-is-a-parameter
#AI #parameter
LLMs contain a LOT of parameters. But what’s a parameter?

They’re the mysterious numbers that make your favorite AI models tick. What are they and what do they do?

MIT Technology Review

Apple loses fourth AI researcher in a month to Meta’s superintelligence team

Apple Inc. has lost its fourth AI researcher in a month to Meta Platforms Inc., marking the latest…
#NewsBeep #News #Headlines #afm #anthropic #Apple #company #fourthairesearcher #group #Latvia #LV #meta #month #NewYork #OpenAI #parameter #people #superintelligenceteam #work #year
https://www.newsbeep.com/29233/

https://www.technologyreview.com/2025/05/20/1116327/ai-energy-usage-climate-footprint-big-tech

This is an actually good article about #AI #energy use, its effect on #carbon, and #ClimateChange.

Here are some key things I would add.

All of these #corporations like #OpenAI #Microsoft and #Google are relying on an #LLM being accessed from the #cloud (the #internet).

#NVIDIA is counting on ever-bigger models.

Do I think this is the future of #generativeai?

No.

I think _that_ is a big #Bubble. I think every "Size Up" on an AI model gives you an extra 20% in quality.

So, running a LLAMA 8b is only 20% better than running a LLAMA 3b.

Right?

So what?

Well, 99% of the use cases people have don't require 1-trillion-#parameter models.

They require models that increasingly can be run locally.

What do I think is coming? It isn't 30-gigawatt data centers, it's laptops running on less energy than current ones, with a terabyte of RAM.

No one wants the #internet that the #MegaCorporations have created. People want their own shit. They want to own their movies.

The #cloud is dead. The companies built on the cloud are dead.

They just don't know it yet.

@mittechreview

We did the math on AI’s energy footprint. Here’s the story you haven’t heard.

The emissions from individual AI text, image, and video queries seem small—until you add up what the industry isn’t tracking and consider where it’s heading next.

MIT Technology Review

Just had an interesting (but a bit unsettling) conversation with #AI about what it thought its future abilities would be. It’s a long response but worth the read.

My question: What new #emergent #abilities will occur when #parameter #numbers can be increased by orders of #magnitude?

AI’s response: This is a fascinating and actively researched question in AI! As language models like GPT grow in parameter count by orders of magnitude (e.g., from billions to tens or hundreds of trillions),

1/8

A cycloidal pendulum - one suspended from the cusp of an inverted cycloid - is isochronous, meaning its period is constant regardless of the amplitude of the swing. Please find the proof using energy methods: Lagrange's equations (in the images attached to the reply).

Background:
The standard pendulum period of \(2\pi\sqrt{L/g}\) or frequency \(\sqrt{g/L}\) holds only for small oscillations. The frequency becomes smaller as the amplitude grows. If you want to build a pendulum whose frequency is independent of the amplitude, you should hang it from the cusp of a cycloid of a certain size, as shown in the gif. As the string wraps partially around the cycloid, the effect decreases the length of the string in the air, increasing the frequency back up to a constant value.

In more detail:
A cycloid is the path taken by a point on the rim of a rolling wheel. The upside-down cycloid in the gif can be parameterized by \((x, y)=R(\theta-\sin\theta, -1+\cos\theta)\), where \(\theta=0\) corresponds to the cusp. Consider a pendulum of length \(L=4R\) hanging from the cusp, and let \(\alpha\) be the angle the string makes with the vertical, as shown (in the proof).

#Pendulum #Cycloid #Period #Frequency #SHM #TimePeriod #CycloidalPendulum #Lagrange #Cusp #Energy #KineticEnergy #PotentialEnergy #Lagrangian #Length #Math #Maths #Physics #Mechanics #ClassicalMechanics #Amplitude #CircularFrequency #Motion #Vibration #HarmonicMotion #Parameter #ParameterizedEquation #GoverningEquations #Equation #Equations #DifferentialEquations #Calculus

Keylength - Cryptographic Key Length Recommendation

Easily find the minimum cryptographic key length recommended by different scientific reports and governments.

Keylength - Compare all Methods

Easily compare the minimum cryptographic key length recommended by different scientific reports and governments.

SciTech Chronicles . . . Feb 11, 2025

Lightly browned in the fires of Hell, Vol II No 37, 350 links. Curated: stash of 'devil's money' found at cult site in the Netherlands. https:...