heehee
Aug 26, 04:53 PM
Can't wait until it comes out. My "work" is getting me a Mac Pro, but I want to wait until this comes out and decide if I should get the Mac Pro or the new MacBook Pro. :cool:
janstett
Oct 23, 11:44 AM
Unfortunately not many multithreaded apps - yet. For a long time most of the multithreaded apps were a select few pro-level things: 3D/visualization software, CAD, database systems, etc. Those of us who had multiprocessor systems bought them because we had specific software in mind, or a group of applications, that could take advantage of multiple processors. As CPU manufacturing processes started hitting a wall right around the 3 GHz mark, chip makers started to transition to multiple CPU cores to boost power - makes sense. Software developers have been lazy for years, just riding the wave of ever-increasing MHz. Now the multi-core CPUs are here and the software is behind, as many applications need serious rewrites to take advantage of multiple processors. Intel tried to get a jump on this with their HT (Hyper-Threading) implementation, which essentially simulated dual cores on a CPU by way of two virtual CPUs. Software developers didn't exactly jump on it and warm up to it. But I also don't think the software industry truly believed that CPUs would go multi-core on a mass scale so fast... Intel and AMD both said they would; I don't know why the software industry doubted them. Intel and AMD are uncommonly good about telling the truth about upcoming products. Both will be shipping quad-core CPU offerings by year's end.
What you're saying isn't entirely true and may give some people the wrong idea.
First, a multicore system is helpful when running multiple CPU-intensive single-threaded applications on a proper multitasking operating system. For example, right now I'm ripping CDs on iTunes. One processor gets used a lot and the other three are idle. I could be using this CPU power for another app.
The reality is that to take advantage of multiple cores, you have to take advantage of threads. Now, I was doing this in my programs with OS/2 back in 1992; I've been writing multithreaded apps my entire career. But writing a threaded application requires thought and work, so naturally many programmers are lazy and avoid threads. Plus, a multithreaded application is harder to debug and synchronize. Windows and Linux people have been doing this since the stone age, and Windows/Linux have had usable multiprocessor systems for more than a decade (it didn't start with Hyper-Threading); I had a dual-processor 486 running NT 3.5 circa 1995. Writing threaded applications has just been more of an optional "cool trick" that the timid programmer avoids. It's also worth noting that it's possible to go overboard with excessive threading, and that leads to its own problems (context switching, thrashing, synchronization, etc.).
Now, on the Mac side, OS 9 and below couldn't properly support SMP; it required a hacked version of the OS and a special version of the application. So the history of the Mac world, until recently with OS X, has been to avoid threading and multiprocessing unless specifically called for, and then at great pain.
So it comes back to getting developers to write threaded applications. And now that we're getting to 4- and 8-core systems, that presents a further problem.
The classic reason to create a thread is to prevent the GUI from locking up while processing. Let's say I write a GUI program that has a calculation that takes 20 seconds. If I do it the lazy way, the GUI will lock up for 20 seconds because it can't process window messages during that time. If I write a thread, the calculation can take place there and leave the GUI thread able to process messages and keep the application alive, and then signal the other thread when it's done.
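A minimal sketch of that worker-thread pattern, in Python purely for illustration (the post itself includes no code, and long_calculation is a hypothetical stand-in for the slow work):

```python
import threading
import time

result = None
done = threading.Event()

def long_calculation():
    """Stand-in for the 20-second computation, run on a worker thread."""
    global result
    total = 0
    for i in range(5):           # shortened loop standing in for real work
        time.sleep(1)
        total += i
    result = total
    done.set()                   # signal the main ("GUI") thread that we're finished

worker = threading.Thread(target=long_calculation)
worker.start()

# The main thread stays free to service its event loop while the worker runs.
while not done.is_set():
    print("main thread still responsive, handling events...")
    time.sleep(0.5)

worker.join()
print("calculation finished, result =", result)
```

A real GUI toolkit would post the result back to the main thread rather than poll like this, but the division of labor is the same.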
But now, with 4 or 8 cores, the problem becomes how you break up the work. Nine women can't have a baby in one month. So if your process is still serialized, you still have to wait while one processor does all the work and the others sit idle. Video encoding, for example, is a very serialized process. I hear some work has been done to encode macroblocks in parallel, but getting 8 processors to chew on a single video is an interesting problem.
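And a toy sketch of the work-splitting idea, under the same caveat (encode_chunk is hypothetical; no real encoder works this simply): farming chunks out to a process pool only helps because the chunks are independent, which is exactly the property a serialized encode lacks.

```python
from multiprocessing import Pool
import os

def encode_chunk(chunk_id):
    """Stand-in for encoding one independent chunk (e.g. a group of macroblocks)."""
    total = sum(i * i for i in range(2_000_000))   # burn some CPU so the split matters
    return chunk_id, os.getpid()

if __name__ == "__main__":
    chunks = range(8)

    # Serialized: one core does all the work while the others sit idle.
    serial = [encode_chunk(c) for c in chunks]

    # Parallel: only possible because the chunks don't depend on each other.
    with Pool() as pool:
        parallel = pool.map(encode_chunk, chunks)

    for chunk_id, pid in parallel:
        print(f"chunk {chunk_id} handled by process {pid}")
```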
The Beatles
Apr 25, 03:16 PM
Asinine
How did they think the location-based features in any app worked? This is just a cached file for those purposes.
And what about all the location-based advertising? So it takes this to make people understand that the world has changed? This is old news, and it's ridiculous that people are now making a scene about it. How about signing electronically at a credit card purchase machine? How about giving someone a check with your account number on the bottom of it? How about giving your personal and sensitive info electronically over the internet?
This is how it is, people. You bought into it a long time ago. It's what it takes to move forward. And the only reason this is a bad thing is that people fail to police themselves, including the people that obtain this info, and that's why we will eventually see some negative repercussions from this collection of data.
But to pinpoint Apple and create a federal case out of something the government already knew was happening is ridiculous.
digitalbiker
Aug 25, 03:31 PM
Over the years I have bought a lot of computers for my business from a lot of different vendors. To be honest, Apple hardware support has never impressed me! :mad: I have actually had much better support from Dell than from Apple.
As far as .Mac goes, it is one of the most poorly supported systems I have ever used in my life. They have a lousy, limited FAQ sheet and a list of common problems, email support is pitiful, and they don't offer phone support. .Mac is a joke for $100.00 a year.
In general, Apple's entire help system in OS X sucks. Searches within the context of an application give you all kinds of crap from every application on the system. There's also no depth to the system: if your problem isn't the most elementary problem possible (99% of which you can figure out yourself), then it won't be in any of the help files.
littleman23408
Dec 2, 03:03 PM
They kind of can't do more detailed damage to standard cars. Premium cars are modeled exactly like their real counterparts: each body part is completely separate from the rest and can be torn off in a collision. Standard cars are one big mesh that can be dented, but not broken apart. In order to give the same level of damage to a standard car, they'd have to update it to a premium model.
I've heard/read chatter that some patches will update some standard cars to premium, but I don't think I've seen anything official yet. Kaz is way too ambitious and had to cut a lot out of the game already. I expect he'll add it in as time goes on, as patches and not paid DLC.
Ah! I didn't realize that. Good to know!
PeterQVenkman
Apr 27, 11:29 AM
I don't feel like reading through all the butthurt comments and strangely political attacks in this thread, so I'll just ask:
How do we know that Apple anonymizes data they do send?
shawnce
Jul 20, 11:56 AM
Yes, it's known as reverse hyperthreading. AMD is working on it.
http://www.dvhardware.net/article10901.html
Reverse hyperthreading? Um, no. (http://arstechnica.com/news.ars/post/20060713-7263.html)
(also note that the article you link even notes that it was a "hoax")
Yvan256
Apr 19, 02:01 PM
Why? iPhones outselling iPod touches by so much makes sense to me.
But it doesn't make sense to a lot of us. The monthly fees on an iPhone are just too much for a lot of budgets. You pay for your iPod touch once and that's it. No more to pay every month after that.
GuitarDTO
Mar 31, 04:43 PM
Man, do these stories bring out the ignorant fanboys. IMO, if you have never owned both an Android phone and an iPhone, you shouldn't be allowed to comment, because 99% of people just can't be objective about it.
Now, I'll hop on my pedestal and say I owned the original Moto Droid and now own an iPhone. The ability to customize your experience on a Droid is what I found so attractive, and Google isn't taking that away, so IMO this story is nothing but good for Android. Better control, more polish, yet the same customization capability that the majority of everyday users want. All of the iBoys tooting their horns and patting each other on the back are doing so for absolutely no reason.
With that said, the polish of the iPhone is what I love the most about it, and if I could pair that polish with Android's ability to personalize my device without jailbreaking, and its much superior notification system, it would be the perfect phone. The next device to get it all right gets my money, whether it's Apple or Google.
Iconoclysm
Apr 19, 08:24 PM
WRONG! They weren't invented at Apple's Cupertino HQ; they were invented back in Palo Alto (Xerox PARC).
Secondly, your source is a pro-Apple website. That's a problem right there.
I'll give you a proper source, the NYTimes (http://www.nytimes.com/1989/12/20/business/xerox-vs-apple-standard-dashboard-is-at-issue.html), which wrote an article on Xerox vs. Apple back in 1989, untarnished, in its raw form. Your 'source' was cherry-picking data.
Here is one excerpt.
Then Apple CEO John Sculley stated:
^^ That's a GLARING admission, by the CEO of Apple, don't you think? Nevertheless, Xerox ended up losing that lawsuit, with some saying that by the time they filed it, it was too late. The lawsuit wasn't thrown out because they didn't have a strong case against Apple, but because of how the lawsuit was presented at the time.
I'm not saying that Apple stole IP from Xerox, but what I am saying is that it's quite disappointing to see Apple fanboys trying to distort the past to make it seem as though Apple created the first GUI, when that is CLEARLY not the case. The GUI had its roots at Xerox PARC. That is a FACT.
http://upload.wikimedia.org/wikipedia/en/7/78/Rank_Xerox_8010%2B40_brochure_front.jpg
Actually, you're WRONG to say he's wrong! You're trying to say that every GUI element was created at Xerox? EVERY one of them? Sorry, but your argument here is akin to something Fox News would air.
Amazing Iceman
Mar 31, 10:02 PM
I've really loved my experience with Android so far. I've had an iPhone and an iPhone 3G and I am an iPhone developer... yet I use Android.
Android will always be "open source" and this is not inconsistent with Google applying more control to stem inoperable fragmentation. These two ideas are not at odds.
I cannot wait for Google to do what I think Amazon is currently trying to do with their new App Store.
That said, I really like the new iPad 2, but sadly my next purchase will probably be an i7 MacBook Pro.
Just a quick question, hopefully not off topic:
Which one do you prefer as a developer, not as a user: iOS or Android?
Good choice about the MBP i7. It's been over 3 years since I got my MBP, and it's time to replace it, but I may get an i7 iMac instead, as I now carry my iPad everywhere.
If a really good MBP comes out, I may reconsider and get one instead of the iMac. Too soon to decide.
balamw
Aug 7, 04:36 PM
I'm not comparing it to system restore but to Volume Shadow Copy from Windows Server 2003. File-by-file snapshot by MS 3 years ago!
Again, fundamentally, a feature that existed in VMS over fifteen years ago and had been promised as part of Cairo. It took them over ten years to finally release it in Windows? :p
I'll have to ask my friendly IT guy, but how does the end user access shadow copies?
Exactly my thoughts!! Looks like a Trekkie (how do you write that??) was let loose :D
Actually, it had a very Star Wars-like feel to it with the angled text. Maybe one of the former Lucasfilm/ILM people from Pixar? :p
"A long time ago in a galaxy far away, a user created a file and overwrote its contents...."
B
EagerDragon
Aug 25, 07:36 PM
Kind of a rude reply to someone who is just posting their experience with Apple.
Without criticism there would never be a reason to improve anything.
100% agreed. There are manufacturing mistakes, and the man should have a right to complain; let's not be rude. Sorry about that, people should not treat you like that.
MacSawdust
Aug 26, 10:40 AM
This now explains why mine is not valid.
sehix
Nov 28, 09:44 PM
Actually, they do. They also got paid on every blank tape sold when cassettes were big. I think it is crazy for everyone to think that the music industry is greedy when it is getting squeezed out of all of its revenue streams.
Actually, they aren't. They're making noises like it's happening, which isn't the same thing.
Popeye206
Apr 19, 02:10 PM
Is that your vetted legal opinion?
We have a lot of couch lawyers in this group. :rolleyes:
jeanlain
Apr 10, 06:56 AM
I don't recall Apple ever placing any presence at or during NAB or AES.
Phil Schiller showing off Final Cut Pro 4 and DVD Studio Pro 2 at NAB 2003 says hello.
Apple was on stage at several NABs. Final Cut Pro itself was introduced there.
nighthawk
Jul 20, 09:03 AM
At some point you're going to hit diminishing returns. Sure, multimedia apps can take advantage of a few more cores, but I don't see Mail running faster on 4 cores, never mind 2! The nice thing about Intel is that they seem to realise that and have invested in improved I/O as well; look at PCI Express and SATA. You can have the fastest processor in the world, but if you're running it with 512 MB of memory you're going to slow down fast!
2IS
Apr 8, 09:59 PM
Most people use their MBA for browsing, YouTube videos, email, office apps and perhaps video conferencing, none of which will be bottlenecked by the Intel IGP. If you're doing something above and beyond this that will be negatively affected by the IGP, you are, in fact, in the minority.
Erasmus
Jul 20, 11:21 PM
The ne plus ultra would be thinking of a result and getting it (or saying it to your computer), like a Photoshop user going: "Well, I would like the sun to be more dominant in that picture, the power lines removed, and those people to look younger". Boom. It happens.
<offtopic>
That would require Artificial Intelligence. If a computer can understand your speech, recognise your choice of words and understand that you don't necessarily mean what you say all the time, then that's AI. If it can recognise specific objects in an "analogue" medium such as a photograph (I don't care if it's a digital photo or not), it's AI. If it can then implement what it has learned, alongside its infinite computational precision, to remake a photo while keeping it completely realistic and making it look exactly how we wanted it to look, that's amazing, and lots of people will be out of jobs.
But if you have an AI system working for you, what's the point of working? ;)
BTW, I mean proper "hard" AI, not some pathetic "Ooh, from your phone number you must live there, therefore I'll direct you to that Pizza Hut outlet! Aren't I smart!" type of AI.
<rant>
Erasmus 4 AI, Nuclear Power, GM, Stem Cell Research, and every other form of Science and Technology. Our lives will only benefit from all these, as will our community and our planet.
</rant>
</offtopic>
Don't Hurt Me.
I have to ask again, even though others already have: is Kentsfield a drop-in replacement for Conroe, if either a mid-tower or the iMac gets Conroe? (Or Clovertown, or whatever the desktop one is.)
Still hanging out for WWDC 2006.
REDolution
Apr 12, 12:49 PM
This was a very good blog post.
I agree, great read
THX1139
Jul 23, 05:03 PM
$999 - Dual 2 GHz (one Conroe)
$1399 - Dual 2.3 GHz (one Conroe)
$1699 - Dual 2.6 GHz (one Conroe)
$1999 - Quad 2.3 GHz (two Woodies, later one Kentsfield)
This is all just a wild guestimate for discussion purposes. Please don't flame me.
At those prices, sign me up for a Quad 2.3!!!! I'll buy that along with a newly designed 23" ACD for $699. :D
Virtualball
Apr 19, 02:32 PM
It appears, though, that from the F700's standpoint the natural progression became TouchWiz.
Wrong. Just because a company released one phone with a similar look to the iPhone doesn't mean their current offerings are a progression of that phone. It's a true testament to who browses this forum if you honestly think that. The F700 didn't run an advanced OS, so it probably ran Symbian or used BREW. That means all Samsung did was create a theme. How does a theme they made 3 years prior to the Galaxy S mean it's a progression of the coding and UI they built? It doesn't. Here's a list of every Samsung phone: http://en.wikipedia.org/wiki/Category:Samsung_mobile_phones Now, pick out one of those and say it inspired all of their new devices 3 years later.
The F700 was an iPhone clone with a keyboard. It's depressing that people are saying that the iPhone copied its own clone.
nsjoker
Aug 11, 12:13 PM
This phone is going to have to be pretty amazing for me to get one... I'm talking a full-fledged iPod with the capabilities of a great cell phone... and decently priced. Terminating my contract just isn't worth it from an economic point of view.