epitaphic
Sep 13, 10:53 AM
What about Tigerton (2007)? Isn't that a "true" quad?
Intel has two lines of Xeon processors:
* The 5000 series is DP (dual processor - e.g. Woodcrest, Clovertown)
* The 7000 series is MP (multi processor - e.g. 4+ processors)
Tigerton is supposed to be an MP version of Clovertown. Meaning, you can have as many chips as the motherboard supports, and just like Clovertown it's an MCM (two processors in one package). 7000s are also about 5-10x the price of 5000s.
So unless the specs for Tigerton change severely, there's no point even considering it for a Mac Pro (a high-end Xserve is plausible).
8CoreWhore
Mar 22, 02:55 PM
Why do they call their tablet a "book"? Just stupid.
DaveTheGrey
Aug 17, 03:55 AM
did you say, "die power pc, die"?
no, that's German for:
(sideshow bob)The Power PC... The!!!(/sideshow bob)
the jury: "no one who speaks German can be an evil man" rofl
janmike34
Apr 11, 03:37 PM
If we're waiting until September for PRODUCTION, then I think we'll see something great in the late fall or early winter.
I just want a leap with iOS 5. My take on notifications:
http://www.youtube.com/watch?v=BqWO6VkJh-0
evilgEEk
Aug 11, 06:08 PM
You might want to read some reviews on the Chocolate before buying it. I've seen a handful of reviews that were less than favorable.
A good resource is Phonescoop.com. They usually have a decent amount of user reviews upon which you can base your purchasing decisions.
A good friend of mine just bought one and it definitely seems to be the best phone Verizon is currently offering. I also want a slider; I'm tired of flip phones. Although if the iPhone were a flip, I'm sure I could handle a flip phone for a few more years. ;)
Verizon's phone selection is just horrible, but that's not enough for me to change services, especially since I've personally never had a single problem with Verizon's coverage or customer service; they've always been great.
I think I'm going to hold off for a month or so, just because I don't have the cash to get a Chocolate and then an iPhone. And if an iPhone is released, I'll obviously have to get one. ;)
EDIT: Thanks for the link though, I'll definitely check it out. :)
gkp
Jun 17, 11:43 AM
I got to my local RS at 8am; nobody was there. I left and came back around 8:45, when there were a few people there. The manager was on his cell phone listening to a conference call. After the call finished, he said it had been decided that they are NOT taking pre-orders, only reservations. He took our names and info and said he would call later in the day with the PIN numbers that were assigned to their store. He also said that he could not reserve any iPhones in their system/computer until 10am. So, basically, what happened is another store entered their reservations before the 10am assigned time and took up all the PINs for OUR area. (This cheating store is in Sacramento, CA.)
So, later in the day, I called and the manager said that they could only hope for some iPhones to be sent to our store and, if so, they would keep ours aside for us (first come, first served). But he said the likelihood of this happening looks grim.
Why did Apple/RadioShack even bother? Even the manager told me the whole process was screwed up.
gerrycurl
Jul 14, 05:59 PM
The question still remains -- will the Power Macs be able to use standard, off-the-shelf PC video cards?
I know you couldn't do so on the Power architecture due to the BIOS irregularities. Now that they're using EFI, does this still mean we have to buy Mac-based cards? Because that's really the question nobody seems to ask and nobody seems to have an answer for.
What this new Mac workstation will mean is the chance to upgrade your Mac with commodity parts. No more Mac tax on hardware. I remember when the Radeon 9700 was king, the price was around $299 for the PC version and $399 for the Mac version.
Think about this: the ability to upgrade the processor, video card, and sound card without having to pay the Apple tax.
That's what it really comes down to. The speculative "good" version of the Mac Pro has a so-so video card, but it's not really worth $600 more just to get the 1800; I'd rather just get the 1600 and upgrade on my own.
CaoCao
Feb 28, 10:02 PM
I seem to recall you agreeing with this post:
And by "living with" I mean having sex and having a family as well.
Yes, I did agree with that post. What is your point?
And yet you seem quite certain how the human brain works and what is normal/ not normal. :rolleyes:
My original point was that you made an assertive, sweeping generalization without any backup. Just a very matter-of-fact "Hey, all you humans, here is how your body was designed. All you gays, you are not the default. Trust me, I'm from teh internetz."
It's clumsy and insensitive at best, and just more religion-based trolling at worst.
Heterosexuality is by definition normal (conforming to a standard; usual, typical, or expected). What percentage of the population is homosexual, what percentage is heterosexual?
Humans by default have four fingers and a thumb on each hand. Am I being mean to people with more or fewer fingers? No, just stating a fact.
And by "living with" I mean having sex and having a family as well.
Yes, I did agree with that post. What is your point?
And yet you seem quite certain how the human brain works and what is normal/ not normal. :rolleyes:
My original point was that you made an assertive, sweeping generalization without any backup. Just a very matter-of-fact "Hey, all you humans, here is how your body was designed. All you gays, you are not the default. Trust me, I'm from teh internetz."
It's clumsy and insensitive at best, and just more religion-based trolling at worst.
Heterosexuality is by definition normal (conforming to a standard; usual, typical, or expected). What percentage of the population is homosexual, what percentage is heterosexual?
Humans by default have four fingers and a thumb on each hand. Am I being mean to people with more or fewer fingers? No, just stating a fact.
oingoboingo
Aug 17, 03:23 AM
But it's not faster. Actually, it's slower than the G5 in some apps. What is everyone looking at anyway? I'm pretty unimpressed, other than Adobe's usage of cache (AE is a cache lover and will use all of it, hence the faster performance).
But the actual Xeon processors are only as fast as the G5 processors. Look at the average specs... the 2.66 machines are only a teeny bit faster than the G5s, except in a few apps like FileMaker. But not in the biggies like Final Cut Pro, where it actually appears that, MHz for MHz, the G5 is the faster machine hands down!
I guess one extra thing to consider if you're taking that point of view is that the Quad 2.5GHz G5 costs US $3299 with 512MB RAM, while the Quad 2.66GHz Mac Pro costs only US $2499 with 1GB RAM, plus a superior case design. Even if the Mac Pro is only the same speed as the Quad G5, it's substantially cheaper.
And that can't be a bad thing.
Timepass
Jul 15, 10:57 AM
I disagree. Using ATX power supplies is a stupid idea. I am sure Apple uses higher quality power supplies than you would pick up at your local CompUSA.
If they allow this there will be a lot of dead Macs, from power supplies whose rails aren't strong enough.
Not to mention those who buy the 400W model because it is only 20 bucks and drastically underpower their Mac.
This would cause too many problems. Keep it proprietary IMO.
Well, I wouldn't worry about that in the case of a Mac. The only people who are really going to replace their PSU are people who know something about computers. A lot of people replace their RAM; PSUs are not upgraded very often, if ever at all.
Also, of the people who do replace PSUs, most know not to cheap out on them. One thing most of the home-builder community agrees on is to NEVER cheap out on a PSU; go name brand. The reasoning being: why would you build a $1k system and then risk it all with a cheap PSU? (The rule can be bent if you're pretty much using dirt-cheap parts to begin with and trying to go as cheaply as possible: less than $500 and old spare parts.) My own PC rig uses an Antec True Power PSU (which I picked up from CompUSA, oddly enough).
I think going ATX is a good thing because it means Apple is going to be using more standardized parts, so it will be cheaper for Apple to get them.
maclaptop
Apr 13, 03:26 PM
1) I'm perfectly happy with the data speeds I get on AT&T 3G. I would guess the new 4G phones will suffer in battery life. I don't want to give up battery life for network speed I don't really need. If I had to choose I would choose battery life every time.
2) It's not the cost of the phone, it's the cost of the data plan. I would guess it will be like the iPhone 3G launch, where AT&T forced you into a 3G plan even if you didn't have 3G coverage in your area.
3) I currently have unlimited data with AT&T which I would like to keep. I doubt very seriously this will be an option with the new "4G" network plans.
4) I can wait for a "4G" phone until there is decent "4G" coverage.
1) Me too
2) I Agree
3) I'm sure you're right
4) Me too
Great post :)
Le Big Mac
Apr 27, 08:27 AM
And here I thought that data wasn't sent to Apple? At least they encrypted it so that you can't tell what actually is sent.
How much is it costing me to send the data to apple so they can crowdsource locations for everyone? I doubt AT&T isn't counting this towards data use.
dscottbuch
Apr 25, 03:05 PM
"a perfect storm", "overreaction", "typical for the us to sue.."
... sorry, but in what ways do I benefit by having apple track my whereabouts to the day and meter? why isn't there an opt-in (apart from the general 'eat **** or die' TOU) or at least an opt-out for this? why is it so easy to access the data?
... apple deserves to get a beating for this.
they're known for focusing on the user in terms of the design and UI of their devices... they should also take the step of focusing on their users' best interests in terms of privacy and freedom, rather than their own greed.
Perfect example of 'journalists' not taking the time to explain what is really happening and then 'readers' not trying to understand. Apple receives NONE of this information. No one receives any of this information. It's simply another file on your phone. Should they (Apple) fix this - YES. Is anyone aware of ANY harm done to ANY person by this (other than the catch-all psychological harm, which can't really be quantified) - I doubt it.
Even the theory that this could be used against you by law enforcement is flawed, as I would bet that collection of this data by a law enforcement agency would be prohibited, since it was NOT opted into by the user.
There is NO HARM here to actually litigate - so the conclusion is that the lawyers are money grubbers.
LegendKillerUK
Apr 6, 02:34 PM
That's a common misreading of what Jobs said.
iOS was developed for the phone first.
As Jobs explained, there was a simple UI demo done on a touch device originally designed to be a keyboard input prototype. That demo gave him the idea to go all touch on the iPhone. That's what he meant by "the tablet came first".
Since we know that during summer/fall the first iPhone UI concepts were done using iPods with wheels, his touch "eureka" moment probably came in late with the UI demo almost certainly done under OSX.
According to all known histories, the actual creation of iOS didn't begin until 2006. Prior to that, some at Apple were still proposing using Linux for the phone OS.
But he then said that after seeing how well it would work on the phone, they put the tablet project on the shelf and focused on the phone, as it was more important. Which means it was a tablet, and not just a touch screen device, in the beginning.
walterwhite
Apr 25, 01:54 PM
Lawyers never seem to see or feel the karma stick for nonsensical and litigious lawsuits that just end up affecting the rest of us... who do our best to be good human beings.
toolbox
Mar 26, 06:33 AM
Good stuff, waiting and ready to pay! :o
Same! As soon as it's available for pre-order, I'll order.
janstett
Oct 23, 11:44 AM
Unfortunately not many multithreaded apps - yet. For a long time most of the multi-threaded apps were just a select few pro-level things: 3D/visualization software, CAD, database systems, etc. Those of us who had multiprocessor systems bought them because we had specific software in mind, or a group of applications that could take advantage of multiple processors. As current CPU manufacturing processes started hitting a wall right around the 3GHz mark, chip makers started to transition to multiple CPU cores to boost power - makes sense. Software developers have been lazy for years, just riding the wave of ever-increasing MHz. Now the multi-core CPUs are here and the software is behind, as many applications need serious re-writes in order to take advantage of multiple processors. Intel tried to get a jump on this with their HT (Hyper-Threading) implementation, which essentially simulated dual cores on a CPU by way of two virtual CPUs. Software developers didn't exactly jump on this and warm up to it. But I also don't think the software industry truly believed that CPUs would go multi-core on a mass scale so fast... Intel and AMD both said they would; I don't know why the software industry doubted them. Intel and AMD are uncommonly good about telling the truth about upcoming products. Both will be shipping quad-core CPU offerings by year's end.
What you're saying isn't entirely true and may give some people the wrong idea.
First, a multicore system is helpful when running multiple CPU-intensive single-threaded applications on a proper multitasking operating system. For example, right now I'm ripping CDs on iTunes. One processor gets used a lot and the other three are idle. I could be using this CPU power for another app.
The reality is that to take advantage of multiple cores, you had to take advantage of threads. Now, I was doing this in my programs with OS/2 back in 1992. I've been writing multithreaded apps my entire career. But writing a threaded application requires thought and work, so naturally many programmers are lazy and avoid threads. Plus it is harder to debug and synchronize a multithreaded application. Windows and Linux people have been doing this since the stone age, and Windows/Linux have had usable multiprocessor systems for more than a decade (it didn't start with Hyperthreading). I had a dual-processor 486 running NT 3.5 circa 1995. It's just been more of an optional "cool trick" to write threaded applications that the timid programmer avoids. Also it's worth noting that it's possible to go overboard with excessive threading and that leads to problems (context switching, thrashing, synchronization, etc).
Now, on the Mac side, OS 9 and below couldn't properly support SMP and it required a hacked version of the OS and a special version of the application. So the history of the Mac world has been, until recently with OSX, to avoid threading and multiprocessing unless specially called for and then at great pain to do so.
So it goes back to getting developers to write threaded applications. Now that we're getting to 4 and 8 core systems, it also presents a problem.
The classic reason to create a thread is to prevent the GUI from locking up while processing. Let's say I write a GUI program that has a calculation that takes 20 seconds. If I do it the lazy way, the GUI will lock up for 20 seconds because it can't process window messages during that time. If I write a thread, the calculation can take place there and leave the GUI thread able to process messages and keep the application alive, and then signal the other thread when it's done.
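To make that concrete, here's a minimal sketch (plain Python, purely illustrative; the names and timings are made up): a worker thread runs the slow calculation and signals an event when it finishes, while the main ("GUI") thread stays free to keep handling its own work.

import threading
import time

def long_calculation(result, done):
    # Stand-in for the 20-second calculation described above.
    time.sleep(2)
    result["value"] = 42
    done.set()  # signal the main thread that we're finished

result = {}
done = threading.Event()
worker = threading.Thread(target=long_calculation, args=(result, done))
worker.start()

# The main ("GUI") thread stays responsive: it keeps processing its own
# events and just checks periodically whether the worker has signalled.
while not done.wait(timeout=0.1):
    print("still handling UI events...")

worker.join()
print("calculation finished:", result["value"])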
But now with more than 4 or 8 cores, the problem is how do you break up the work? 9 women can't have a baby in a month. So if your process is still serialized, you still have to wait with 1 processor doing all the work and the others sitting idle. For example, if you encode a video, it is a very serialized process. I hear some work has been done to simultaneously encode macroblocks in parallel, but getting 8 processors to chew on a single video is an interesting problem.
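As a rough illustration of splitting that kind of work (again just hypothetical Python, assuming the chunks really are independent): a process pool can fan independent pieces out across the cores, but any step that depends on the previous one still runs serially.

from concurrent.futures import ProcessPoolExecutor

def encode_chunk(chunk):
    # Stand-in for encoding one independent slice of a video
    # (e.g. a group of frames with no dependencies on other slices).
    return sum(chunk)  # dummy "work"

if __name__ == "__main__":
    chunks = [list(range(i, i + 1000)) for i in range(0, 8000, 1000)]
    # Fan the independent chunks out across the available cores.
    with ProcessPoolExecutor() as pool:
        encoded = list(pool.map(encode_chunk, chunks))
    print(len(encoded), "chunks encoded in parallel")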
Zadillo
Aug 27, 05:28 PM
I see where you're coming from.
So does this mean there will be no PowerBook G5s next Tuesday?
Hey, you never know.... ;)
deconai
Aug 11, 03:47 PM
Yes. EVERYONE. If you don't believe me, maybe you'll believe The Economist:
http://www.economist.com/printedition/displayStory.cfm?Story_ID=4351974
Please note that the graph is about three years old. Nowadays a lot more of the countries are over 100%.
That is insane! It's interesting to note the number of people with multiple phones...
terkans
Jul 20, 11:56 AM
Yes, it's known as reverse Hyper-Threading. AMD is working on it:
http://www.dvhardware.net/article10901.html
um, no:
http://arstechnica.com/news.ars/post/20060713-7263.html
lorductape
Nov 28, 06:39 PM
I suspect the main reason that Microsoft agreed to pay money in the first place is that they needed to get the music labels on board to boost the Zune Music Store, Microsoft was in the weaker position here and I believe the labels exploited that weakness.
I believe, correct me if I'm wrong, that Microsoft suggested it to Universal in the first place.
braddouglass
Apr 6, 12:56 PM
A hard drive uses less than 2 Watts while reading or writing. Flash uses the same or more when it is used; it only has an advantage when it is not used, where the hard disk drive has to spend energy to keep the drive spinning (less than 1 Watt).
So I suppose standby temps would be low, and operating temps would be about the same as any other laptop. Sounds good to me, haha.
All I want is a faster processor and a backlit keyboard and I'll be happy with it.
Already, with a flash HD and 4GB of RAM, it should be wicked fast, but I'd like an i5 at least...
shelterpaw
Aug 11, 11:14 AM
What I gather would really make the iPhone something special:
PhantomPumpkin
Apr 27, 10:49 AM
Apple identified it? No. Check your history. It was brought TO Apple's attention over a year ago.
It was again brought TO Apple's attention via various reports and articles.
THEN Apple looked into the matter.
I commend Apple for taking action (now).
But let's not rewrite history, shall we?
You're just misinterpreting what I was saying. They identified it as a potential issue, instead of saying "there's nothing wrong, we're not going to do a darned thing." I wasn't saying they brought it to the media's attention on their own.
Nitpicking is, well, nitpicky?