Tymmz
Aug 29, 11:06 AM
Why do these "tree-huggers" have to interfere with business?
Apple does what it can to process its products in more "environmentally friendly" ways. But 4th worst?
"Tree-huggers"? "Interfere with business"? We don't want to start that discussion!
Do you have proof for your statement that Apple is doing its best?
Eso
Mar 18, 09:53 AM
Sir it is perfect.
You are paying for the same thing. I have an unlimited plan and I have never gone over 5GB. If one person has a 2GB plan and never goes over, and we both surf the internet, tethering or not, what's the difference?
It's easy to make the argument that unlimited data plans are priced according to the average amount of data that wireless devices use. The average amount of data used while tethering can be shown to be substantially higher, resulting in higher costs and justifying a higher price. The key is that their argument rests on the cost of providing unlimited data; your argument rests on the amount of data used. In either case (tethered or not), users can use an unlimited amount of data.
DeepDish
Aug 29, 11:26 AM
How come Dells last half as long? Because they're "better made"? Do they not actually function any more? Or is it that you don't throw an Apple out because of sentimentality?
The only reason we've dumped computers at work is because they're not worth upgrading. In the last six months that's included one Dell, two PowerMac G4s (although I claimed them) and six iMac G3s. They simply weren't up to the (business) task anymore. The oldest computer we have in the office is actually a Dell that we use for one program.
Not out of sentimentality. The other PCs are so cheap, sometimes it is easier to just buy a new one.
kingtj
Apr 15, 09:59 AM
I can't speak for everyone, but I found myself torn between clicking to rate it positive, or to rate it negative. Why? Not strictly because I think there was anything wrong with someone from Apple participating in this project and contributing.... But more because in a larger, overall sense, I think the whole "bullying" thing is being blown out of proportion in recent years.
Basically, it's just the latest crusade for folks to take up, as yet another "we've gotta do anything to save the children!" move.
I'm a 40-year-old adult, but I clearly remember struggling with a lot of bullying from the time I was in 1st or 2nd grade through the first half of high school. I was a kid who didn't really fit in with any of the norms. I didn't like organized sports, and was really bad at playing them. I was really into science fiction/fantasy when it was decidedly "uncool" to show any interest in that. And I didn't have any clue, or care, about dressing in whichever clothing styles were considered "in style".
There was a point, during my early high-school years where I even thought about "ending it all" on a daily basis. (Only reason I didn't go through with it is because I think I was too chicken and afraid of pain to attempt it.)
Even given that background, I still can't see how all this "anti-bullying" nonsense will accomplish much. I know in my situation, every time teachers or faculty were called upon to try to "do something" about my problems, it only made matters worse. It's part of human nature that kids have mean streaks, and the only thing that's guaranteed to make a bully stop bullying you is to stand up for yourself, to his/her face. Asking OTHER people to solve the problem just escalates it, most of the time. (The faculty or teachers or even police can't guard a kid 100% of the time. Eventually, the kid(s) harassing him/her are going to corner the kid in a place where the parental figures aren't able to intervene, and it's going to get ugly -- especially since now it's about "payback" for getting those authority figures involved.)
Only 2 things ever remedied my situation. #1 was fighting back, punching a kid square in the jaw and sending him to the nurse's office, when he started chasing after me on the school playground. I earned a TON of respect that day and a whole lot of people who used to harass me backed off after that. #2 was getting older, along with my peers, and all of us simply growing out of that phase where being different was perceived as a negative.
Why on earth are people marking this as 'negative'?!?
ideal.dreams
May 2, 09:08 PM
Just another reason for people to use Firefox. Safari is bloated in my opinion anyways.
But regardless, this is hardly a threat and I don't see what the big deal is. From what I can tell, this malware is downloaded through user error. Not only do you have to have Safari set to open "safe" files, but you also have to visit the site in order to download it, which by now I assume Safari will warn you about anyways.
If this is the result of computer geniuses trying their hand at a Mac virus, then I'm not worried about the future security of my Mac at all.
xper
Apr 13, 07:54 AM
I will save my major comments until I see the shortcut layout, the amount of customization, and hear from the working industry . . . you know, the ones too busy getting it done to attend the event. Not the ones that got paid to go.
The shortcuts haven't changed, and it is possible to remap shortcuts, so no need to worry.
SRSound
Sep 26, 12:00 AM
So say I'm using my 8-core Mac Pro for CPU-intensive digital audio recording. Would I be able to assign two cores to the main program, two to virtual processing, two to auxiliary "ReWire" applications, and two to the general system? If so, I guess I need to hold off on my impending Mac Pro purchase!
Apple OC
Apr 24, 04:53 PM
Many people say this, but their argument fails at the point where those actions belong to culture and are not representative of the religion itself.
I invite you to demonstrate how Islam is a threat to freedom and democracy.
I guess all this honour killing pretty much explains the original point about how women's freedom has been affected.
Thanks again, edifyingG, for presenting some very valid points.
stcanard
Mar 18, 12:13 PM
But it could be fixed by: encrypting (or changing the way it is encrypted) the AAC file on the transfer from iTMS to the player,
or forcing the player to send the authorization code to Apple to wrap on <i>their</i> servers before sending it back to the player.
If they do the server fix it'll take more than a day.
And it will take Jon a day to figure out how the iTunes client generates that key and spoof it. Again by definition DRM has to be insecure, because the client must have all the information necessary to break it.
In interviews Steve Jobs has gone on record saying that unbreakable DRM is impossible. What you're seeing from Apple is a "good enough" strategy. After all, they don't really care, it's only there to appease the RIAA.
Does anybody have more of an idea of how the DRM wrapping is done and how the un-DRMed file is transferred?
There's a good overview of what's happening at Ars.
Basically the issue (and I hadn't thought about this) is that the song has to be individually encrypted for each client; that's how it's made playable on your system and not other people's. Because they're using Akamai to cache and distribute the files, they can't distribute pre-encrypted ones! (The analogy is that it would be like libraries carrying a copy of the book for everyone who might borrow it.) Apple can't link everything back to their servers, as that would create a bottleneck.
Instead it's your copy of iTunes that's actually adding the DRM (and that's probably why the new Motorola phone won't let you buy directly from the store; it can't add the DRM).
It's an interesting problem. I would bet you will find this hole in WMA stores for the same reason. Of course Jon prefers to target the source that will get him headlines.
Apple will make another "good enough" fix to block it for another 6 months. But they really don't care. Although externally they "care", I bet internally it doesn't particularly bother them because ITMS is so big that the record companies can't afford to pull out of it.
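The "client must hold everything" point can be illustrated with a toy sketch (this is emphatically not FairPlay's real scheme; the key derivation and the XOR "cipher" are made-up stand-ins). If the client application both derives the key and applies the wrapper, then by definition the client also has everything needed to remove it:

```python
# Toy illustration only: a client that wraps content with a key it derives
# itself can always unwrap that content, no matter what the server intends.
import hashlib

def derive_key(machine_id: str, account_id: str) -> bytes:
    # Hypothetical key derivation from values the client already knows.
    return hashlib.sha256((machine_id + account_id).encode()).digest()

def xor_stream(data: bytes, key: bytes) -> bytes:
    # Trivial XOR "cipher": wrapping and unwrapping are the same operation.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

song = b"plain AAC audio data"
key = derive_key("machine-1234", "user@example.com")

wrapped = xor_stream(song, key)       # what the client stores on disk
unwrapped = xor_stream(wrapped, key)  # what any spoofing client can redo

assert unwrapped == song  # the client-side secret is no secret at all
```

Any program that reimplements `derive_key` (which is what a spoofed client does) gets the plaintext back, which is why moving the wrapping server-side is the only structural fix.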
arn
Sep 20, 12:50 AM
ya, seems unlikely the hard drive is for DVR functionality [as someone pointed out, there are no video inputs on the device]... but the hard drive could prove useful in other ways.
It brings up an interesting thought, though: how it complements the DVR. Wonder if Apple has thought about licensing the streaming component of it to TiVo, for example. It seems like it might be nice if TiVo could play protected iTunes content on your home network.
Or on the flip side, Apple could license TiVo in a box of their own.
arn
dgree03
Apr 28, 08:47 AM
The complaint isn't that iPads aren't being included in the smart phone market. The complaint is that there is a sole focus on smart phones when comparing Android vs. iOS market share when clearly the iPad and iPod Touch are very significant portions of the iOS platform.
This is not a "smart phone" platform battle. This is a new mobile computing platform battle. But since Android has no viable competitors to the iPad or iPod Touch, people (Fandroids and analysts alike) conveniently like to leave those devices out of the equation.
The tangible item is the smartphone hardware itself. That's like saying the battle between Sony and Samsung LCD TVs isn't exactly about TVs... it's about Google TV (Sony) vs. Samsung Smart TV.
TheUndertow
Apr 10, 06:50 AM
Will never, ever happen. Do some research. Nintendo is originally based in Japan, not the USA.
And guess who's come back from the dead?
http://blogs.wsj.com/digits/2011/04/08/commodore-64-welcome-back-old-friend/?mod=google_news_blog
What goes around comes around. Apple can stay on top for only so long; sooner or later, they're bound to fall. They're human and they can't keep it up forever.
EDIT: I meant this http://www.commodoreusa.net/CUSA_TronVideo.aspx
Do some research?.....Hahahahahah.
I meant it a little in jest, but I fail to see how Nintendo originating from Japan (as a trading card company, amongst other things... research....) would make them unable to be purchased by a US-based company.
All I'm saying is that if Nintendo fails (which they were close to not that long ago... GameCube), I could see their "spot" in people's living rooms lining up with where Apple wants to be.
So far, Apple has had the foresight to anticipate market conditions and supply issues... if they keep thinking ahead (in process and practice), they'll be hard to beat.
snoopy
Oct 12, 11:41 AM
Originally posted by benixau
for crying out loud, who cares if a PC can do its sums better than a Mac... if I am more productive on my Mac then it doesn't matter that it might be a little 'slower'...
True for many of us. For applications that use a lot of math functions, it makes a big difference. So, for others it does matter. They may be in the minority, but a very important group of users. In less than a year the picture will change, and that small group will be very pleased with the Mac. For now, there is nothing anyone can do about it.
AP_piano295
Apr 23, 12:43 AM
No one is concluding that there was a single "bang," and I'm certainly not conflating anything. "Bang" is a metaphor, and no one is relating it to the "origin of life." You're trying to inflate your own ego and place your "scientific literacy" on display here by arguing a point that no one is questioning.
It certainly seems that you are questioning the point.
You raised the point that it is/was illogical for me to believe that life and the universe appeared in a sudden "bang", and you claimed that such a belief could not possibly be based in logic :rolleyes:.
Of course I never purported to believe any such thing, rather you simply implied that this is what I believe.
In my original post I never claimed to understand or remotely fathom how the universe and life came to exist. But the fact that I do not know how our universe came to be has very little bearing on this conversation.
I have very little understanding of how the computer I am currently using ACTUALLY works. Yet work it does; it does not work through the grace of god but rather through marvels of modern engineering and achievements in scientific understanding.
Your god of the gaps is simply a dark room waiting for someone to turn on the light.
Will_reed
Jul 11, 10:12 PM
I wonder if this will be good enough to cut the 4K footage off my yet-to-be-purchased RED camera. However, I think the quad G5 would be enough.
C N Reilly
Mar 18, 12:59 PM
I'm not worried about this. There are only two possibilities:
1) AT&T is just assuming anyone who uses more than X amount of data must be tethering, and shooting out threats. In such a case, all you have to do is call them and tell them you stream a radio station all day. They take you off the "evil tetherer" list; end of problem. (I've already seen two people post elsewhere that this has worked for them.)
2) There actually is something in the software/firmware that's enabling AT&T to tell who's tethering. In this case, the jailbreakers will just add some code to the next release to block or fool that bit of code. End of problem.
All signs thus far point to (1) being the truth, btw.
silentnite
May 4, 11:49 AM
Safari is not set as my default browser, and I only use it if Mozilla is stalling. But this is only the beginning for Apple; with its continued success come a lot of security issues for the future.
dante@sisna.com
Sep 12, 07:07 PM
Please explain to me, even hypothetically, how this could be a TiVo-killer DVR. As a basis for the argument, consider that TiVo (as of today) can record 2 HD channels simultaneously while watching a third, previously recorded show. Plus you can pause live TV.
Elgato and Myth and all of the cable & satellite co. DVRs haven't been able to compete with TiVo to date; what makes you think they will be able to going forward?
How does Elgato not compete?
Sure it does:
1) I can pause mine.
2) I have a full software based one-click scheduling system
3) I can record high def content.
4) If I use two cards, I can record two streams via a signal splitter.
5) I can certainly watch a prerecorded show while doing all of the above: my Quad Core easily handles this.
Backtothemac
Oct 7, 10:32 AM
These tests that this guy puts up are crap! The Athlon is overclocked to be a 2100+, and none of the systems have the most current OS. I have personally seen great variations in his tests over the years, and personally, I don't buy it. Why test for single-processor functions? The Dual is a DUAL! All of the major apps are dual-aware, as is the OS!
Try that with XP Home.
jiggie2g
Jul 12, 05:29 PM
jiggy:
your thinking is exactly why most PCs suck; Dell etc. choose components that are "good enough" or choose some unsuitable CPU because it sounds fast. Woodcrest makes the most sense to go into the Mac Pro, Conroe into the iMac, Merom into the MBP, simple as that.
just because something is not for you does not mean how you want it is how it should be. you're a kid who likes playing with PC hardware and likes components with "big numbers" and overclockability, and while a quad would be wasted on you, it'd be great for people who actually buy Mac Pros/PowerMacs.
you give PC users a bad name; it's not the other way around.
Oh, and Apple doesn't go to Samsung and Micron for its RAM like everyone else, or Pioneer/Toshiba/Matsushita for the DVD burner? How about Maxtor/Seagate for the hard drives? Apple goes to Samsung/LG.Philips for its LCD panels just like Dell and HP, and now Intel for its CPU/northbridge chipsets. C'mon, it's all a con; they all shop at the same store, dude. Newegg..lol
the only thing Apple about your Mac will be the pretty case and OS X. Other than that it's just another PEECEE.
javajedi
Oct 11, 06:30 PM
Originally posted by javajedi
What you are saying makes a lot of sense. Now that I think about it, I too recall reading this somewhere.
Now that we know the real truth about the "better standard FPU", I thought it was time to shed some light on non vectorized G4 integer processing.
It still does 200,000,000 calculations, but this time I'm multiplying ints.
Motorola 7455 G4 @ 800MHz: 9 seconds (Native)
IBM 750FX G3 @ 700MHz: 7 seconds (Native)
Intel P4 @ 2600MHz: 2 seconds (Java)
PowerPC 7455 integer processing is considerably better than its floating point (obviously less work doing ints), but still less per cycle than the Pentium 4.
Very interesting: the G4 loses both floating point and integer to the IBM chip, at a 100MHz clock disadvantage.
I'm still waiting to see that "better standard FPU" in the G4. It seems the G4 is absolutely useless unless you are fortunate enough to have vectorized (AltiVec) code.
Alex, yeah, the native version was compiled under 3.1. It really is interesting to note that despite the 750FX's 100MHz clock disadvantage, it is able to outperform the G4 by 22%. Since there is a 13% difference in clock speed, if clocks were equal the 750FX would be roughly 25% more efficient in scalar integer. I should also re-emphasize that I never bothered compiling the test natively for x86; I left it in Java, so it's not out of the question that the P4 could do this in 1 second - and that is *NOT* using any vector libraries, just plain old integer math.
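Purely as a sanity check, the quoted timings can be normalized to cycles per operation (a rough sketch that assumes the loop really performs exactly 200,000,000 multiplies and nothing else):

```python
# Normalize the quoted benchmark timings to cycles per operation,
# using the figures from the post: 200,000,000 integer multiplies per run.
OPS = 200_000_000

def cycles_per_op(clock_hz: float, seconds: float) -> float:
    # Total cycles elapsed divided by operations performed.
    return clock_hz * seconds / OPS

g4 = cycles_per_op(800e6, 9)  # Motorola 7455 G4 @ 800 MHz, 9 s -> 36.0
fx = cycles_per_op(700e6, 7)  # IBM 750FX G3 @ 700 MHz, 7 s -> 24.5

print(f"G4:    {g4:.1f} cycles/op")
print(f"750FX: {fx:.1f} cycles/op")
```

The per-clock gap between the two PowerPC chips is visible directly in the cycles-per-op figures, independent of their different clock speeds.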
I've found some documentation on the AltiVec C programming interface, and this weekend I'm going to make a first attempt at vectorizing it. The integer test should be no problem, but my FPMathTest app that did square roots will be more difficult. AltiVec has no double-precision floating point, which complicates doing square roots. If you want more accurate, full-precision square roots, you have to do Newton-Raphson refinement. In other words, more ************ you have to go through. I believe SSE2 does have double-precision floating point ops, so if you were to vectorize it there, you wouldn't have to compensate for this.
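The Newton-Raphson refinement mentioned above can be sketched in scalar form (plain Python standing in for the vector code; AltiVec's `vrsqrtefp` gives only a low-precision estimate of 1/sqrt(x), and each refinement pass roughly doubles the number of correct bits):

```python
# One Newton-Raphson step for the reciprocal square root:
#   y' = y * (1.5 - 0.5 * x * y * y)
# Starting from a rough hardware estimate, a few iterations converge
# to full precision; sqrt(x) is then recovered as x * (1/sqrt(x)).
def refine_rsqrt(x: float, y_est: float) -> float:
    return y_est * (1.5 - 0.5 * x * y_est * y_est)

x = 2.0
y = 0.7  # deliberately poor estimate of 1/sqrt(2) ~ 0.70711
for _ in range(3):
    y = refine_rsqrt(x, y)

sqrt_x = x * y  # sqrt(x) = x * (1/sqrt(x))
assert abs(sqrt_x - 2.0 ** 0.5) < 1e-9
```

In the actual AltiVec version the estimate would come from the hardware instruction and the refinement would run on four floats per vector, but the arithmetic per lane is exactly this.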
Another theory as to why the P4 is scoring so well: if I'm not mistaken (and I'm not), the P4's ALU runs at double its clock - so in my case, 5.2GHz. I'm sure this relates to the issue.
I don't know how true this is, but I wouldn't be surprised if there is some truth to it; surely some food for thought:
http://www.osopinion.com/perl/story/17368.html
The G4 was just a hacked-up G3 with AltiVec and an FPU (floating point unit) borrowed from the outdated 604
If this is the case, then it's no wonder we are getting these abysmal scores, no wonder a 400MHz Celeron can nearly equal it, and no wonder the 750FX can outperform it (different company, different FPU).
NebulaClash
Apr 28, 01:26 PM
Personally, I very VERY much hope Apple does allow the iPad to grow into a fully independent device and break its locked-down link to iTunes.
Unfortunately, seeing as the iTunes link is Apple's money-making link, I cannot see them allowing this to happen for a long time, meaning it will never grow to its full potential as a fully independent device.
Well, in the future I'm talking about, which involves cloud computing, the link will still be there, but it will be over the air. But it seems you are talking about not having any link to iTunes. But then what do you want to link it to? The Android app market? Cydia? I mean, you need to have some place to link it to in order to hook into the world of apps (plus backups, etc.). Even our PCs are not standalone by that definition, basically needing a Net connection to get much done.
So what is an independent device to you? Independent of what?
sinsin07
Apr 9, 01:19 AM
benixau
Oct 12, 03:22 PM
If you want exceptional mathematical performance, then why are you getting a microcomputer? I cannot out-type my computer, and I cannot do mathematical functions faster than it, or even than Excel with all of its overhead.
BTW, my G4 is soooo slow at doing maths functions that I finished an assignment a whole 5 mins ahead of a mate. In Excel. This was some serious slowdown stuff: 10 cross-referenced, dependently linked, nested-function sheets. Now, my Mac only has two 867s with 256MB of DDR; his P4 2.53 with 512MB couldn't beat me, even WITH WIN95.
Now, any more real-world tests you would like? :D