Yes, the compiler/interpreter can figure it out on the fly; that’s what we mean by untyped languages. And as stated, both have their merits and their faults.
Elon doesn’t know what the words mean and just chimes in with his AI future BS.
Yes! Just because a compiler could guess the type doesn’t mean it should. Elon didn’t understand the meme at all.
Are there untyped languages? You probably meant ‘dynamically typed languages’.
But even statically typed languages can figure out most types for you from the context - it’s called ‘type inference’.
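For example, in TypeScript (just a rough illustrative sketch, not tied to any particular codebase):

```ts
// TypeScript infers these types from the values; no annotations needed.
const count = 42;          // inferred as number
const label = "total";     // inferred as string
const doubled = count * 2; // fine: number * number

// The compiler still checks everything at compile time:
// const oops = count * label; // error: can't do arithmetic on a string
```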
Most of my code is untyped. First I type it, then I realize it’s all wrong and use backspace to untype it.
This is the dumbest thing I’ve read all week. Congrats. Lol
Well, that would depend on the definition and on what exactly you mean by untyped.
The untyped part usually refers to the way the programmer interacts with the language, for example not setting a type for variables and parameters. But then there’s the question of whether the programmer is ever allowed to explicitly set the type. Furthermore, if the programmer does explicitly set the type, does that mean the type can’t change at a later point? Another question could be: can the programmer check or enforce what type a variable or parameter is? And if there is only one type of data in the language, would that be a typed or untyped language? But I would consider these to be details that all fall under the untyped umbrella, with untyped just meaning not-typed.
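To make the check/enforce question concrete, here’s a rough TypeScript sketch (illustrative only), using `any` to stand in for an untyped variable:

```ts
// With `any`, the variable behaves like one in an untyped language:
// its type is free to change at any point.
let value: any = 123; // starts life as a number
value = "hello";      // later becomes a string, no complaints

// The programmer can still check the type at runtime before using it:
if (typeof value === "string") {
  console.log(value.toUpperCase()); // safe: we verified it's a string
}
```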
Then there’s the question of the technical implementation of the language. Defining a language is one thing; actually having it run on a real system is another. Technical systems usually require explicit types at some point. Something somewhere needs instructions on how to handle the data, and this usually leads to some kind of typing instructions being added along with the data. But depending on how many abstraction layers there are, this can quickly become a very pedantic discussion. I feel what matters is the design, definition and intent of a language. The actual technical implementation isn’t what matters, in my opinion.
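As a sketch of what “typing instructions added along with the data” might look like, here’s a hypothetical tagged-value representation in TypeScript (not how any particular runtime actually implements it):

```ts
// A dynamic runtime typically stores a tag next to each value so it
// knows how to handle the raw bits. Very roughly:
type TaggedValue =
  | { tag: "number"; value: number }
  | { tag: "string"; value: string };

function add(a: TaggedValue, b: TaggedValue): TaggedValue {
  // The runtime inspects the tags to decide what "+" should mean.
  if (a.tag === "number" && b.tag === "number") {
    return { tag: "number", value: a.value + b.value };
  }
  // Otherwise fall back to string concatenation.
  return { tag: "string", value: String(a.value) + String(b.value) };
}
```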
I feel like there are so many programming languages and technical systems at this point that every variation and exception exists. And if you can think of one that doesn’t exist, expect a follow-up comment from somebody pointing out that it does exist after all, or that they’ve started a project to make it exist in the near future.
Would you say OCaml or any ML-family language would be untyped, since they have type inference?
From what I know about those, I would consider them to be typed languages. Even if the programmer doesn’t explicitly assign the types, they need to be aware of them and take into account what type something will be. I am familiar with F#, for example, and it’s strongly typed.
We’re also at the point where traditionally untyped languages can be strictly typed (strict TypeScript), and typed languages can let you drop the annotations (Java’s var, which still infers a static type underneath).
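For the first half of that, a small sketch of what strict TypeScript adds, assuming "strict": true in tsconfig.json:

```ts
// Under strict mode, untyped parameters are rejected instead of
// silently becoming `any`:
// function greet(name) { ... }  // error: parameter implicitly has an 'any' type
function greet(name: string): string {
  return "Hello, " + name;
}

// strictNullChecks also forces missing values to be handled:
function firstChar(s: string | null): string {
  // return s.charAt(0);         // error: s might be null
  return s === null ? "" : s.charAt(0);
}
```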
assembly
In VBA you can declare everything as Variant.
Is that untyped?
Even in untyped languages, can’t you explicitly set your type, either with declarations or by wrapping the value in quotes for a string or something?
Depends on the language. There is no explicit typing in JavaScript, for example. That’s why TypeScript was invented.
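Roughly the difference (an illustrative sketch):

```ts
// Plain JavaScript has no place to write the type down:
//   let age = 30;
//   age = "thirty"; // JavaScript happily allows this

// TypeScript layers an explicit annotation on top:
let age: number = 30;
// age = "thirty";   // compile error: string is not assignable to number
```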
Not always.
Ah, that could be problematic
Untyped as in written? Or is this programming term I’m not familiar with?
Programming term. Variables in programming languages can hold different types of data, such as whole numbers, floating point numbers or strings of characters (“text”). Untyped languages figure out on the fly what can and cannot be done to the content of a variable, while typed languages strictly keep track of the type of content (not the value) to catch bugs and improve performance, for example.
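If it helps, a tiny TypeScript illustration of “tracking the type of content, not the value” (TypeScript uses a single `number` type for both whole and floating point numbers):

```ts
let quantity: number = 7;     // whole number
let price: number = 19.99;    // floating point number, same type
let label: string = "total";  // a string of characters ("text")

// A typed language keeps track of the type, so mix-ups are caught early:
// let oops: number = label;  // compile error: string is not a number
```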
Ah! Thank you for the explanation. That makes much more sense now.
Very concise explanation!
Any untyped languages that don’t care what is in the variable, assume you know what you’re doing, and just YOLO it?
Not necessarily. Usually errors are detected at runtime and reported as such. So you will see where your program failed, but it usually crashes nonetheless. Keep in mind that crashes are usually better than continuing with some undefined behavior.
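For example, a sketch of such a runtime failure, using TypeScript’s `any` to stand in for an untyped variable:

```ts
// Nothing stops us at compile time, but the mistake surfaces at runtime.
const data: any = 42;

try {
  data.toUpperCase(); // throws TypeError: data.toUpperCase is not a function
} catch (err) {
  // The program reports where it failed instead of silently carrying on
  // with garbage, which is usually the better outcome.
  console.error("Crashed here:", err);
}
```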
By typed they mean declaring a type for your variables.
In some languages, variables need to be told what kind of data they can hold. That’s their type. For instance, a number without decimals would be an integer type, while text might be a string type or a list of character types.
Other languages don’t require types and sometimes don’t even support them. They will just infer the type from the data that’s in the variable.
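A quick TypeScript sketch of both styles side by side (illustrative; TypeScript only has a single `number` type, so it stands in for “integer” here):

```ts
// Explicitly told what it can hold:
let count: number = 3;                  // declared type
let word: string = "cat";               // text as a string type
let chars: string[] = ["c", "a", "t"];  // or as a list of characters

// Left for the language to work out from the data:
let inferred = 3.14;                    // no annotation; inferred as number
```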
If you see Elon Musk please explain this to him.
I’m an idiot, and I still don’t think I could dumb it down to his level.
It’s actually hieroglyphics.
Might be able to call assembly untyped. Everything beyond that I think would be called either statically or dynamically typed, maybe weakly typed?