At 3/9/08 06:58 AM, CybexALT wrote: Which concludes my argument that maths is better than physics.
of course, at the same time our maths is built upon a set of axioms that are assumed to be true, and from which everything else is derived :)
At 3/9/08 07:12 AM, dELtaluca wrote:At 3/9/08 06:58 AM, CybexALT wrote: Which concludes my argument that maths is better than physics. of course, at the same time our maths is built upon a set of axioms that are assumed to be true, and from which everything else is derived :)
I had a boner, then you guys started talking and now it's gone :(
SIG YOINK!
At 3/9/08 07:17 AM, JPI wrote: I had a boner, then you guys started talking and now it's gone :(
Don't worry, Luis will help you get it back :)
At 3/9/08 07:12 AM, dELtaluca wrote:At 3/9/08 06:58 AM, CybexALT wrote: Which concludes my argument that maths is better than physics. of course, at the same time our maths is built upon a set of axioms that are assumed to be true, and from which everything else is derived :)
What are these axioms? Things like a*b = ab, or much more simple things?
How are you doing in maths, delta? Am I beating you yet?
C1: 100
C2: 96
C3: 100
C4: 97
FP1: 94
FP2: finished the course, not taken A level yet
FP3: half way through
M1: 87
M2: 83
M3: 89
M4: finished course, not yet taken A level
S1: 100
At 3/9/08 07:41 AM, CybexALT wrote: How are you doing in maths, delta? Am I beating you yet?
when were you not?
C1: 100 - 94
C2: 96 - 91
C3: 100 - 89
C4: 97 - 86
FP1: 94 - 94
FP2: finished the course, not taken A level yet - same
FP3: half way through - not started yet, doing in summer with FP2
M1: 87 - 100
M2: 83 - 94
M3: 89 - not started yet
M4: finished course, not yet taken A level - not started yet
S1: 100 - 82 ( I HATE STATS )
D1 : 97
of course, i did my C1,2 in year 10, and C2,3, M1, M2, S1 in year 11. silly GCSE maths.
either way you know i still kick your ass regardless of marks :P
---
in regard to axioms, i refer to things such as this: http://en.wikipedia.org/wiki/Peano_postulates
Rather than get drawn into a debate, I'm just going to pick up on everyone's little errors :s
At 3/8/08 07:39 PM, J1mp wrote: The fact is the sun weighs a LOT more than earth, so earth is pulled towards it.
There's a distinction between mass and weight :P The sun has more mass than the earth. Mass is the quantity of matter; weight is the force experienced due to mass, and is dependent on the gravitational field the object inhabits.
At 3/8/08 07:39 PM, J1mp wrote: Gravity is not a theory, it is a fact.
Well, it would be a fact if you're talking about the physical phenomenon of being pulled towards the centre of Earth (unless you think that Earth is flat :/ ). The explanations of this phenomenon, such as your old skool inverse-square law, are theories, which, by the way, all space programs up to this point have relied on the truth of.
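For reference, the "old skool inverse-square law" here is Newton's law of universal gravitation, which can be written as:

```latex
F = G \frac{m_1 m_2}{r^2}
```

where $F$ is the gravitational force between two masses $m_1$ and $m_2$, $r$ is the distance between their centres, and $G$ is the gravitational constant. The "inverse-square" part is the $1/r^2$: double the distance and the force drops to a quarter.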
At 3/9/08 06:58 AM, CybexALT wrote: However, if you said to me prove that sqrt(2) is irrational, or prove that a^2 + b^2 = c^2 in a right angled triangle, then I could prove that in a way that leaves no doubt in anyone's mind.
That's actually a pretty interesting topic in itself :P Like the testing of every combination of particles in the universe, I don't think even this could be proved conclusively. We all know that humans are fallible, so no matter how conclusively you think you've proved that sqrt(2) is irrational, or sin/cos = tan, or sqrt(-1) is imaginary, or even 1 + 1 = 2, there's always the slight possibility that we all just suck at Maths and have made some colossal mistake regarding it.
Clearly it's so unlikely as to be completely and absolutely negligible, but the possibility is still finite.
At 3/9/08 09:08 AM, Paranoia wrote: or even 1 + 1 = 2
Because it doesn't. It equals 11.
At 3/9/08 09:08 AM, Paranoia wrote:At 3/9/08 06:58 AM, CybexALT wrote: However, if you said to me prove that sqrt(2) is irrational, or prove that a^2 + b^2 = c^2 in a right angled triangle, then I could prove that in a way that leaves no doubt in anyone's mind. That's actually a pretty interesting topic in itself :P Like the testing of every combination of particles in the universe, I don't think even this could be proved conclusively. We all know that humans are fallible, so no matter how conclusively you think you've proved that sqrt(2) is irrational, or sin/cos = tan, or sqrt(-1) is imaginary, or even 1 + 1 = 2, there's always the slight possibility that we all just suck at Maths and have made some colossal mistake regarding it.
Clearly it's so unlikely as to be completely and absolutely negligible, but the possibility is still finite.
No. 1+1 is definitely 2, because that's what it's defined as. Maths is a man-made idea. Maths was invented, not discovered.
Saying 1 + 1 might not equal 2 is like saying the word "Hello" might actually be an insult instead of a greeting. "Hello" is a greeting because someone defined it as that. It's not like someone just discovered "Hello" and we think it's probably a greeting. Hello definitely is a greeting, just like 1 + 1 definitely equals 2 and sqrt(2) definitely is irrational.
By the way, 1 + 1 also equals 10.
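Both joke answers can actually be made precise. A quick sketch (Python, my own illustration, not from the thread): "11" is what you get by gluing the symbols together as strings, while "10" is 2 written in base 2.

```python
# The two joke answers above, made precise:
# "11" comes from string concatenation, "10" from binary notation.
concat = "1" + "1"     # joining symbols as strings, not adding numbers
binary = bin(1 + 1)    # the number 2 written in base 2, with a "0b" prefix
print(concat)          # 11
print(binary)          # 0b10
```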
At 3/9/08 10:10 AM, CybexALT wrote: No. 1+1 is definitely 2, because that's what it's defined as. Maths is a man-made idea. Maths was invented, not discovered.
Maths may have been a man-made idea, but that man was not you :P How can you say that your definition of 1 + 1 is the same as that of every other human being on the planet?
At 3/9/08 10:17 AM, Paranoia wrote:At 3/9/08 10:10 AM, CybexALT wrote: No. 1+1 is definitely 2, because that's what it's defined as. Maths is a man-made idea. Maths was invented, not discovered. Maths may have been a man-made idea, but that man was not you :P How can you say that your definition of 1 + 1 is the same as that of every other human being on the planet?
Some people might have different ideas of what 1 + 1 is, but if they do, they're wrong.
1 + 1 = 2 is such a simple statement that everyone who agrees with it has the same understanding of it.
And that is why much more complicated ideas can be formed with absolutely no disagreements between mathematicians. Now get your philosophy away from my maths!
At 3/9/08 10:17 AM, Paranoia wrote:At 3/9/08 10:10 AM, CybexALT wrote: No. 1+1 is definitely 2, because that's what it's defined as. Maths is a man-made idea. Maths was invented, not discovered. Maths may have been a man-made idea, but that man was not you :P How can you say that your definition of 1 + 1 is the same as that of every other human being on the planet?
because then they would be referring to the same principle, which is why axioms are important.
we define a unary function, the successor function, which takes us from one natural number to the next. it needn't be a number; you could easily use a system of symbols where each successive symbol contains an extra line, it doesn't matter.
through various other axioms, we can then define addition of natural numbers recursively as
a + 0 = a
a + S(b) = S(a+b)
where S is the unary successor function, take an example:
a + 1 = a + S(0) = S(a+0) = S(a)
hence, for any natural number a, a + 1 is the successor of a. for the natural numbers, if a is 1 then S(a) is 2, hence 1 + 1 = 2
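That recursive definition translates almost directly into code. Here's a toy sketch (the class names `Zero` and `S` are my own invention, nothing standard): a natural number is just nested applications of the successor, and addition follows the two defining equations above.

```python
# Toy Peano arithmetic: a natural number is either Zero,
# or S(n), the successor of another natural number.
class Zero:
    def __repr__(self):
        return "0"

class S:
    def __init__(self, pred):
        self.pred = pred
    def __repr__(self):
        return f"S({self.pred!r})"

def add(a, b):
    # The two defining equations:
    #   a + 0    = a
    #   a + S(b) = S(a + b)
    if isinstance(b, Zero):
        return a
    return S(add(a, b.pred))

one = S(Zero())   # 1 = S(0)
two = S(one)      # 2 = S(S(0))

print(add(one, one))  # S(S(0)), i.e. the numeral 2
```

Evaluating `add(one, one)` unwinds exactly the chain a + S(0) = S(a + 0) = S(a) worked through above.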
At 3/9/08 10:31 AM, dELtaluca wrote: because then they would be referring
wouldn't**
I agree with Delta and his urinary functions.
At 3/9/08 10:29 AM, CybexALT wrote: Some people might have different ideas of what 1 + 1 is, but if they do, they're wrong.
1 + 1 = 2 is such a simple statement, that everyone who agrees with the statement has the same understanding of it.
And that is why much more complicated ideas can be formed with absolutely no disagreements between mathematicians. Now get your philosophy away from my maths!
I'm not saying that there's a serious possibility that Maths just doesn't work. It's so unlikely to be true that any sane person would disregard it. All I'm saying is that there's a finite possibility, however magnificently tiny, that everything you think you know about the universe is wrong.
At 3/9/08 10:31 AM, dELtaluca wrote: hence, for any natural number, a + 1 is the successor of a, which for the set of natural numbers if a is 1, S(a) is 2, hence 1 + 1 = 2
That's all very well and good, but it assumes all that natural number stuff to be true.
We can only take things to be true if we assume other things to be true first. For example, we can say that the world is round so long as we assume that scientists and everyone aren't involved in a massive plot to deceive us, or that the world actually exists and isn't all something we've dreamed up.
Similarly, all that Maths stuff works so long as you assume either that your mind is working perfectly well, or that the rest of the universe isn't trying to trick you into believing a load of bollocks about numbers making logical sense.
The problem with this is, there's no ultimate starting point on which you can base all your theories of the universe. Everything you see and record reaches you through your senses, and you've no reason to be 100% sure that your eyes and ears don't deceive you. Prior experience will tell you that they almost certainly don't, but even an incomprehensibly big improbability leaves a finite possibility.
At the end of the day, all you can be completely sure of is cogito ergo sum, and that doesn't help all that much.
I'm convinced you'll all find an argument in absolutely anything. I need breakfast so I won't get sucked into this vortex.
At 3/9/08 10:59 AM, Paranoia wrote: That's all very well and good, but it assumes all that natural number stuff to be true.
it's not a case of 'that natural number stuff' being true, that makes no sense. we DEFINE the natural numbers, and say 'these are the natural numbers, with them 1 + 1 = 2' (lol)
At 3/9/08 11:09 AM, dELtaluca wrote:At 3/9/08 10:59 AM, Paranoia wrote: That's all very well and good, but it assumes all that natural number stuff to be true. it's not a case of 'that natural number stuff' being true, that makes no sense. we DEFINE the natural numbers, and say 'these are the natural numbers, with them 1 + 1 = 2' (lol)
Again, you're assuming that your understanding of natural numbers is true. It doesn't matter who's defined what - there is always a possibility that you've misunderstood them or been misled about them. Unless you're the one and only person who defined natural numbers, and even then there's a possibility that your memory of how you defined them is flawed.
I get what you're trying to say, but that argument only applies to physics. Maths is definitely correct. Even if I'm crazy and just imagining it all, it's still correct because it works. The only things that are assumed are things which I've stated are fact. You can't one day discover that it's not fact, because it's something I just made up.
This is my ultimate analogy.
If I have an imaginary friend called Bob, you can't tell me that there's a small chance he's not called Bob. I invented him, so he's called Bob. From that fact, we can assume many things like:
1. I have an imaginary friend
2. My imaginary friend has a name
3. My imaginary friend's name has 3 letters
4. My imaginary friend's name begins with B
etc...
A physicist would go further and assume the following things:
5. My imaginary friend is male
6. My imaginary friend has a penis
and shortmonkey would go even further and assume:
7. The Sun goes round the Earth
Statements 1, 2, 3 and 4 are definitely true. They assumed nothing apart from what I already stated: that I have an imaginary friend called Bob. Statements 5 and 6 might not be true. They're probably correct, considering my imaginary friend is called Bob, that is a male's name, and males have penises, but they assume things like "Bob is definitely a boy's name" and "I gave my imaginary friend a boy's name" and "all boys have penises".
And number 7 is just plain wrong.
See, my imaginary friend is maths. I invented something to be true, and the mathematicians have calculated things that are also true, the physicists have taken the maths and used it for things which are probably true, and shortmonkey has probably watched a documentary made by conspiracy theorists.
Brownie points for anyone who can think up a better analogy than that.
At 3/9/08 11:32 AM, Paranoia wrote: It doesn't matter who's defined what - there is always a possibility that you've misunderstood them or been misled about them. Unless you're the one and only person who defined natural numbers, and even then there's a possibility that your memory of how you defined them is flawed.
That doesn't mean maths is wrong, it just means you're not very good at it.
I'm going to pick you apart again here.
At 3/9/08 11:33 AM, CybexALT wrote: This is my ultimate analogy.
If I have an imaginary friend called Bob, you can't tell me that there's a small chance he's not called Bob. I invented him, so he's called Bob. From that fact, we can assume many things like:
1. I have an imaginary friend
2. My imaginary friend has a name
3. My imaginary friend's name has 3 letters
4. My imaginary friend's name begins with B
Here's my two counter-analogies:
//1
1. I have a real life friend
2. He tells me that he has an imaginary friend
3. He tells me that his imaginary friend is called Bob
Now, this imaginary friend is Maths as defined to you by other people. You can say for sure that statements 2 and 3 are correct, assuming that your friend is 100% honest. But you don't know for sure that your friend is 100% honest, since everyone has the potential to lie, so you can't say for sure that statements 2 and 3 are true.
It doesn't matter that your friend is the one who defined the friend - you're relying on human honesty for your idea of it.
Now, there's that example. Now let's go onto Maths and Logic as defined by you:
//2
1. I have an imaginary friend
2. My imaginary friend is called Bob
Now, let's call back to childhood:
I can't remember anything from before I was four
The above example is proof that human memory is fallible. Not that we needed an example - I just like using the code tags. In aid of that, let's throw in another fact:
Human beings are known to deceive themselves // e.g - last page
Anyway, the point here is that while you can be almost entirely convinced that your imaginary friend is called Bob, and that you do indeed have an imaginary friend, there is a slight possibility that in the time since you 'defined' your imaginary friend, you have forgotten about him, then brought in self-deception to give yourself false memories about him.
To summarise: to disprove that a statement is 100% correct - i.e. that there is zero chance of it being false - all you have to find is one single instance where it is false. In this case, the counter example would be:
1. I have an imaginary pet dog called Colin
2. In the time between defining my imaginary pet and recalling information about him, I make a mistake.
3. In an example of forgetfulness and self-deception, I replace Colin the Imaginary Dog with Bob the Imaginary Human.
4. A statement I make, either to myself or someone else, about the character of my imaginary pet or friend is false.
Since one counter example exists, however unlikely, the statement cannot be said to be completely and certainly true.
At 3/9/08 11:51 AM, Paranoia wrote: Since one counter example exists, however unlikely, the statement cannot be said to be completely and certainly true.
and that is not important.
what is important is that (lol analogy) if 1. is useful, then the fact that we can prove 2 and 3 makes it more useful.
we don't create maths to be the absolute truth, we create it to be useful, and the fact that we can define the natural numbers to have addition etc is useful
At 3/9/08 10:33 AM, citricsquid wrote: You can't change what 1+1 ='s and if you do you're wrong. It's surely the same as saying a chicken lays tables, when it doesn't :/
Chickens lay tables? That's so cool.
But yeah, I agree with Luis. You guys will find argument in anything :D
Without maths 1+1 = 1+1.
No shit. I'm off to go learn GLUT.
and either way, that doesn't apply to addition and natural numbers, because nowhere in the definition of natural numbers do we say
'if this is true' or 'suppose that'.
if i say 'for each natural number, there is a successor' and you say 'but what if there isn't', i say 'well then we aren't talking about natural numbers anymore'
At 3/9/08 12:18 PM, dELtaluca wrote: if i say, for each natural number, there is a successor. and you say 'but what if there isn't' i say ' well then we aren't talking about natural numbers anymore'
Assuming of course that you're right :P
At 3/9/08 12:02 PM, dELtaluca wrote: we don't create maths to be the absolute truth, we create it to be useful, and the fact that we can define the natural numbers to have addition etc is useful
I've got no problem with you there. There isn't really a difference between 'definitely true' and 'so almost definitely true that the possibility that it isn't is negligible' as far as usefulness goes. I'm just saying that there isn't such a thing as a definite truth (other than your own existence), since none of us are known to be infallible in either reasoning or analysing the world outside of our heads.
At 3/9/08 12:53 PM, Paranoia wrote: I'm just saying that there isn't such a thing as a definite truth
everything is relative; definite truth is relative. you may as well say that there is no such thing as the earth and that we are all just imagining it.
At 3/9/08 12:55 PM, dELtaluca wrote:At 3/9/08 12:53 PM, Paranoia wrote: I'm just saying that there isn't such a thing as a definite truth. everything is relative, definite truth is relative. you may as well say that there is no such thing as the earth and that we are all just imagining it.
There is no such thing as the Earth. We are all just imagining it.
At 3/9/08 01:00 PM, UnknownFury wrote: There is no such thing as the Earth. We are all just imagining it.
agreed, hence all efforts are futile as we are but figments of Tom Fulp's imagination.
There are a few ways to develop the present day number system. Delta is referring to the Peano Axioms.
As humans, we 'accept' certain definitions to make communicating with one another much simpler. 2 is just some crazy symbol, and we say that crazy symbol is the same as S(1), and 3 is some crazy symbol, which is S(2). Since 2 is S(1), 3 = S(S(1)). What's S(S(S(S(S(S(S(S(S(1)))))))))? Some OTHER crazy symbol.
Integers, as another example, are really disjoint sets of pairs, each of which represents the "difference" (whatever THAT means... I'm glazing over a lot here) between a bunch of natural numbers. For instance, -1 could be written as an infinite set of pairs of natural numbers: { (1, 2), (2, 3), (3, 4) ... } for the relation aRb = (a - b), or something to that effect.
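To make that slightly less hand-wavy: in the usual construction (my sketch of it, anyway, not necessarily Doug's), the pair (a, b) stands for the difference a - b, and two pairs name the same integer when a + d = b + c, which says a - b = c - d without ever using subtraction:

```python
# Integers built from pairs of naturals: (a, b) stands for a - b.
# Two pairs represent the same integer when a + d == b + c
# (i.e. a - b == c - d, but stated without subtraction).
def same_integer(p, q):
    (a, b), (c, d) = p, q
    return a + d == b + c

# All of these pairs represent -1:
assert same_integer((1, 2), (2, 3))
assert same_integer((2, 3), (3, 4))
# (2, 1) represents +1, which is a different integer:
assert not same_integer((1, 2), (2, 1))
print("all pairs agree")
```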
Addition looks like "+" and subtraction looks like "-". If you asked a programmer what "%" as a binary operator meant, he/she would say "modulus", but a Mathematician might not know what the hell you were talking about (they tend to write the word "mod"). If you then said "%" means "mod", then they could use % just as well as the programmer. If you wanted to, you could use a circle for the successor function, and a square in place of 1, and you could redevelop the whole number system with those symbols. It'd be just as meaningful, but it wouldn't act as a good platform to communicate with other people because nobody would know what the hell you were going on about.
2 doesn't MEAN 1 + 1. There's more to it than that. 1 + 1 = 2 because of the way the Peano axioms are defined (in this case! again, there are other ways...) and the accepted way we derive from there. If I made up some "Doug System" and defined 1+1=3, then 1+1=3 would be an absolutely true statement, given the "Doug System". So you don't have to believe that some symbol 1, some symbol +, some symbol 1 again, some symbol = and some symbol 2 is "true" in your own little world, but if you assume a system based on the Peano axioms, and assume 'accepted' mathematical definitions, then 1 + 1 = 2 for those assumptions.
Last, delta also mentioned proof by induction, but didn't make it explicit. It works well for numbers in N: you prove P(1), assume P(x), and prove P(x+1), where P is some statement about a natural number. Proving those cases proves P for all natural numbers.
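As a worked example of that induction template (my choice of statement, not one from the thread), take P(n) to be the claim that 1 + 2 + ... + n = n(n+1)/2:

```latex
\text{Base case } P(1):\quad 1 = \tfrac{1 \cdot 2}{2}. \\[4pt]
\text{Inductive step: assume } P(x):\quad 1 + 2 + \cdots + x = \tfrac{x(x+1)}{2}. \text{ Then} \\
1 + 2 + \cdots + x + (x+1) = \tfrac{x(x+1)}{2} + (x+1) = \tfrac{(x+1)(x+2)}{2}, \\
\text{which is exactly } P(x+1). \text{ Hence } P(n) \text{ holds for all } n \in \mathbb{N}.
```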
At the end of the day, it's really just a stoned-hippie argument. "Oh MAN, what if a cat was really a DOG, man!!" If you want to define cats to be dogs, that's okay. According to your definition, cats are now dogs. Nobody else has to believe you, just like you don't HAVE to accept the Peano axioms or any of modern Mathematics based on those axioms. You could make a much better argument, of course, if you understood cats and dogs very well, and could somehow try to prove that people's definitions of cats and dogs were flawed, rather than just saying "what if" with no reasonable argument beyond that.
We should take care not to make the intellect our god; it has, of course, powerful muscles, but no personality.
I had one apple but I ate it. =(
No more apples.
Was there an apple to begin with? :O
WTH is with these stupid debates?!
Snow Days
I don't know if your workplace/school offers days off for very snowy weather, but it gets really damn annoying that they haven't called a snow day in my city for the past 9 years. We have PILES of snow, literally TONS, and we still have to venture out into the cold because our salaries/grades depend on it.
>=(