Type Systems (last updated: 20/4/05 11:18 UTC)

Summary

Different programming languages use different type systems.  I believe that the best type system is the one with the least danger of producing bugs caused by the programmer assuming the code will compile one way when it actually compiles another.

Terminology

The term "strongly typed" often appears when speaking about type systems, but because of varying definitions, I will not use it.  More important than jargon, in my opinion, are code reusability and maintainability.  While a type system does not force a programmer to write good or bad code, it does exert influence.

The Dig

Some languages have explicitly defined datatypes, whereas others do not.  Some languages require the programmer to explicitly convert from one datatype to another, whereas others do not.  While it can be nice to not have to explicitly define variables and not have to explicitly perform conversions, there is a severe price to pay.  Usually the compiler will convert and represent variables as you expect, but sometimes it will not.  Bugs can arise that are very hard to find, as the programmer is almost sure the code in error is executing correctly.

Another reason for using languages with explicitly defined datatypes and no implicit conversions is that people seem to write better code under these conditions.  It also tends to be easier to read and understand code in this type of language.

Depending on the situation, code execution time might also be an issue.  The system I advocate would perform the fastest in any case, since no hidden conversion code needs to be generated.  Often, though, execution time is not the deciding factor.

Widening conversions

So, I was not telling the complete truth.  There is a class of type conversion that is virtually error free, a conversion that almost every programming language does implicitly: widening conversions.  If you add a 16-bit integer to a 32-bit integer, the compiler will often extend the 16-bit integer to a 32-bit one.  This rule is not hard and fast -- sometimes the compiler widens depending on what order the two different datatypes are in.  Again, this is not something to worry about, as "good" compilers will warn you before performing conversions that might result in loss of precision.

"Good" languages

Explicitly typed and no implicit conversions.

"Bad" languages

You never know...