So I was thinking: ideal computer programming consists of accounting for every single thing.
This sounds kind of absurd.
The reality of this, however, is freaking nuts.
If we were to do this with documentation, we would end up with more than just the basic stuff accounted for (e.g. "10", let's say in PHP, would point to general documentation about integers, and possibly to information about what 10 itself is and means); we would have documentation that was perfect to the nth degree, pure 24-karat, so to speak.
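Just to make the idea concrete, here's a minimal sketch in PHP using the built-in tokenizer (token_get_all). The doc-URL table is my own hypothetical mapping for illustration, not anything standard; the point is that every token in a line of code, including the literal 10 itself, could be pointed at the relevant manual page:

```php
<?php
// Hypothetical lookup table: token kind -> manual page (my own mapping,
// just to illustrate the idea of every token linking to its docs).
$docs = [
    T_LNUMBER  => 'https://www.php.net/manual/en/language.types.integer.php',
    T_VARIABLE => 'https://www.php.net/manual/en/language.variables.basics.php',
];

$code = '<?php $x = 10;';

// token_get_all() splits the source into tokens; compound tokens are
// arrays of [token_id, token_text, line], single chars like ';' are strings.
foreach (token_get_all($code) as $token) {
    if (is_array($token) && isset($docs[$token[0]])) {
        // e.g. the literal "10" (T_LNUMBER) maps to the integer docs
        echo token_name($token[0]) . ' "' . $token[1] . '" -> '
            . $docs[$token[0]] . PHP_EOL;
    }
}
```

Scale that table out to cover every token, every value, every construct, and you start to see what "accounting for every single thing" would actually take.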
What is the status of this kind of thing in the computer programming world (and beyond!)?