nozebleed
May 5, 10:51 AM
I still believe it's just where you are in the country. This graph is the exact opposite of what I experience. Verizon work phone - SHITE. Dropped calls so bad I forwarded the number to my iPhone. AT&T personal phone - no dropped calls.
AidenShaw
Sep 21, 11:15 AM
...you have a Tivo - you have made the decision to keep your recorded TV media in its traditional place - the living room / den.
The iTV concept starts from the premise that this is an outdated concept.
You have some interesting points, but for some people there are other considerations....
mattk3650
Apr 5, 09:23 PM
Wanna know the reason behind this? People on Verizon don't have the iPhone and aren't leaving the company, so they just buy the next best thing.
If there's no iPhone on Verizon before 2011 I'm getting a Droid, so hurry up, Apple.
eleven59
Apr 10, 11:00 AM
This must be me, but I've never cared to have a program maximized on my Mac... Not even games. I always prefer to see multiple programs so I can click easily on any when needed... It's also nice that just hovering over one lets you scroll through it without actually clicking on it...
And resizing.. That takes me less than a second to drag and resize a window to what I want it to be.. if I even have to
mroddjob
Apr 13, 05:24 AM
I'm confused as to why everyone is saying this is a step down from FCP7. From what I saw of the feeds, Apple was just showcasing some of the new features. I may be wrong, but I didn't see anywhere where they said they were taking functionality out. They didn't mention Color or the rest of FCS, but they didn't say they were getting rid of everything. So how can people say this has dropped down to a prosumer level?
If all they did was re-write with 64-bit support then it would be a step up, but they also added some new useful features (maybe not game-changing, but I'm sure everyone will find something that will be helpful rather than a hindrance). In which case, in my book this still makes it pro software.
It was a first look at beta software; they haven't said they've removed anything, so everything people are saying is just speculation for the sake of complaining.
Gelfin
Apr 24, 03:03 PM
In answer to the OP's question, I have long harbored the suspicion (without any clear idea how to test it) that human beings have evolved their penchant for accepting nonsense. On the face of it, accepting that which does not correspond with reality is a very costly behavior. Animals that believe they need to sacrifice part of their food supply should be that much less likely to survive than those without that belief.
My hunch, however, is that the willingness to play along with certain kinds of nonsense games, including religion and other ritualized activities, is a social bonding mechanism in humans so deeply ingrained that it is difficult for us to step outside ourselves and recognize it for a game. One's willingness to play along with the rituals of a culture signifies that his need to be a part of the community is stronger than his need for rational justification. Consenting to accept a manufactured truth is an act of submission. It generates social cohesion and establishes shibboleths. In a way it is a constant background radiation of codependence and enablement permeating human existence.
If I go way too far out on this particular limb, I actually suspect that the ability to prioritize rational justification over social submission is a more recent development than we realize, and that this development is still competing with the old instincts for social cohesion. Perhaps this is the reason that atheists and skeptics are typically considered more objectionable than those with differing religious or supernatural beliefs. Playing the game under slightly different rules seems less dangerous than refusing to play at all.
Think of the undertones of the intuitive stereotype many people have of skeptics: many people automatically imagine a sort of bristly, unfriendly loner who isn't really happy and is always trying to make other people unhappy too. There is really no factual basis for this caricature, and yet it is almost universal. On this account, when we become adults we do not stop playing games of make-believe. Instead we just start taking our games of make-believe very seriously, and our intuitive sense is that someone who rejects our games is rejecting us. Such a person feels untrustworthy in a way we would find hard to justify.
Religions are hardly the only source of this sort of game. I suspect they are everywhere, often too subtle to notice, but religions are by far the largest, oldest, most obtrusive example.
thogs_cave
Jul 12, 11:53 AM
you're all looking at the server specs, which have no need for more than 8x PCI-E, if that.
Actually, I was just reading a bit on PCI-E, and apparently even the beefy dual-card (SLI) GFX don't saturate a pair of 8x slots. Quad SLI might need 16x, but for one or even two cards the boost from 8x to 16x is pretty much a wash.
(And this was from a PeeCee magazine!)
munkery
May 2, 04:56 PM
Again, look, if you're not interested in the mechanics, that's fine. Stop replying to me.
My post is inquiring about the mechanics. For the past hour, I've been trying to find how this thing ticks by searching around for in-depth articles (none to find, everyone just points to Intego's brief overview that is seriously lacking in details) or for the archive itself.
If you don't want to take this discussion to the technical level I am trying to take it, just don't participate.
The Javascript exploit injected code into the Safari process to cause the download of a payload. That payload was the installer. (EDIT: the Javascript code did not exploit a vulnerability in Safari).
The installer is marked as safe to auto-execute if "open safe files after downloading" is turned on.
An installer is used to trick users to authenticate because the malware does not include privilege escalation via exploitation.
If you had any technical knowledge you could have figured that out yourself via the Intego article.
I don't know of any other Web browser (this is not an OS problem, it's a Safari problem) that automatically assumes executables are safe and thus should be auto-executed.
Installers being marked as safe really doesn't increase the likelihood of user level access as any client-side exploit provides user level access. I don't understand why you are hung up on this installer being able to auto-execute; it really makes no difference in terms of user level access. The attacker could have deleted your files with just an exploit that provides user level access.
What does Webkit2 have to do with running an installer on the OS after downloading it? That happens outside the rendering engine's sandbox. You're not quite understanding what this sandbox does if you think it protects you against these types of attacks.
Webkit2 will prevent user level access via an exploit. Preventing these types of attacks is the intended purpose of sandboxing.
ezekielrage_99
Sep 26, 12:34 AM
Until they get the 45nm process up and going, I think this is going to be the top of the line. 4 cores topping out around the mid 2GHz range.
I wonder if this is Intel's long term strategy-- keep the cores relatively untouched, but double the number with each process step. That'll be entertaining for a generation or so, but they're going to have to come up with something else.
Sounds like both Intel and AMD are going by the philosophy of "more cores, more speed".
It looks like the programmers will be in for a fun old time.
iliketyla
Apr 20, 06:27 PM
And that's why I find it hilarious how Android enthusiasts always state how "Apple's closed garden" is a negative element, when it's the unregulated nature of Android that degrades the experience.
Please explain to me how I am experiencing a "degraded" experience on my current Android phone?
I can do everything your iPhone can, plus tether at no additional cost and download any song I want for free.
Ease of use in Android is just as simple as an iPhone, with the ability to customize IF YOU SO PLEASE.
So if you would, cut the degraded experience crap.
MacinDoc
Apr 13, 01:25 AM
I've been in IT for a while. "Professionals" are some of the most set in their ways people I have EVER met. I know guys who were annoyed when motherboards became available that let you adjust things like clock multipliers and such in the BIOS instead of having to use jumpers on the motherboard.
Most "professionals" aren't so much masters of their craft but people who understand how to use certain tools. If those tools become available to anyone the "professionals" feel threatened and lash out.
Mind you, while I love OS X, if the terminal was ever removed from the OS I'd cease using it. Once you know how to use a shell properly there's tons of stuff that's simply easier to do from there. I love ease, just so long as it's not at the cost of Pro grade functionality when I need it.
That's my point, though. Adding a graphic interface to OS X did nothing to reduce the power of the Terminal. As you say, as long as the choice is still available to use the underlying power, we should not object if ease of use is added on top of that. I think most video editors would want the video software equivalent of a DSLR, rather than the equivalent of a point-and-shoot camera. Ease of use for everyday things, but the power of manual controls when needed.
Chaos123x
Apr 13, 12:43 AM
Day one purchase. Been dying to get all of my 8 cores working in FCP for years.
Of course I'm gonna keep my current FCP installed till the bugs are fixed and I learn the new version.
TennisandMusic
Apr 21, 04:12 PM
If you don't mind, I would like to explain that.
I cannot vouch for all the people. I can vouch for most that I have seen.
I am a part of TI, SerDes, which is designed in TI, UK [UK Design]. I have been to TI's headquarters [Dallas, Texas] a number of times, and every time I go, I have seen people using iPhones and BlackBerrys. TI still gives BBs to all the employees, but most have their personal iPhones. It was really hard to spot a guy using an Android phone out of the close to a thousand people I could spot on campus.
We run most of our software on SunOS 2.6 [Solaris]. We do some of our development work on Windows [which is a PAIN in the OS for no native support for PERL, Python, ClearCase, etc].
The reason I believe that's the case is because:
1. The most important: people have a life. They don't wish to tinker with their phones; whether it's easy or hard, they just have no time. We buy smartphones to work for us and do everything on their own. We don't want to work for our 'smartphone' to make it usable. People just don't have time.
2. The quality of service Apple provides is hands down the best customer service for any product that is theirs. It's great.
3. The iPhone is probably the most usable phone at this time. Android is just on the other side: widgets/customization, that's about it. Low-quality apps / no apps is the case there.
People want something that just works without much effort. These things are to simplify our lives and not complicate, so that we can concentrate on actual work.
Some people get this; some don't.
Yeah I pretty much agree on those points. I've had them all, had the iPhone 4, bought an android (Galaxy S) and a windows phone 7 (Samsung Focus) and am now back on the iPhone 4 with no regrets.
AceWilfong
Apr 24, 03:34 PM
People don't like the idea of no longer existing, and religion solves that.
Plus, it is a way to control people. A very effective one! That's why it is still here today in the age of science. Religion has been refined over thousands of years to make sure it keeps itself going and keeps people believing without question.
This book says there is an invisible man in the sky who made the earth. We know this because the invisible man wrote the book. He listens to you but doesn't answer. If you do as he says you go to a wonderful afterlife, but if you don't you go to a horrible one.
Excellent! And it would not surprise me to learn that religion was invented by Kings, not Gods.
MacFly123
Mar 18, 02:50 PM
I can maybe get behind the whole 'dishonest' thing, but... seriously. If I have an iPhone and an iPad, and I decide to surf some sites or stream music through Pandora on my iPad using tethering instead of doing those exact same actions on my phone, am I now 'stealing' that data even though it would have been the exact same usage?
I realize there are other scenarios you could bring up that would be more like 'taking advantage' of the system, but me personally- if I'm using the data in a way I feel is no different than I would be using with my phone, I don't have any bad conscience about it whether it's allowed or not.
I thought I made clear in my post that this is simply double billing what is supposed to be an unlimited plan for many and I do NOT agree nor think it is ethical for the carriers to do this! But, when people sign a contract and agree to the terms doing otherwise is not being honest. Plain and simple.
Analog Kid
Sep 26, 12:21 AM
Until they get the 45nm process up and going, I think this is going to be the top of the line. 4 cores topping out around the mid 2GHz range.
I wonder if this is Intel's long term strategy-- keep the cores relatively untouched, but double the number with each process step. That'll be entertaining for a generation or so, but they're going to have to come up with something else.
My bet? Specialized cores. You've got some that are optimized for floating point, some for application logic, some for media. This is where Cell gets it right, I think-- they're a step too far ahead for now though.
Biggest problem is getting the system to know what threads to feed to what core, and to get application writers to specialize their threads.
326
Jun 18, 08:17 AM
New to the forums, but not new with AT&T. I used to own a Nokia phone through AT&T and never had any dropped-call issues until after they merged with Cingular, which used to be Pacific Bell cellular.
Pacific Bell cellular I was on years ago, and it lasted no more than 8 months tops. The reason being, their connection reliability was absolute junk. Didn't matter where I was standing, the signal strength was garbage.
So then I switched to AT&T, not knowing that the two companies would merge a year and a half later.
During my time with AT&T the signal strength was solid, secure and very reliable. Consistent.
Then the merge happened, and the customer service is where I noticed a significant nose dive heading south. Poor service.
I continued my time as an AT&T customer, being that the Nokia phone was still reliable and the signal strength consistent. Then I upgraded to a Motorola flip, which was also reliable.
When I made the move to the iPhone 3G is when I noticed my signal strength consistency begin to weaken. However, I love my iPhone so much and use it for everything mobile that it's tolerable.
I am hoping that this new antenna system that's integrated in the new iPhone 4 puts strong signal strength and reliability back into the hands that AT&T used to have and be known for.
Hopefully one day this world will unify as one to focus forward to reach outside of the box, instead of focusing on the $ sign which divides the world into pieces :apple:
shawnce
Sep 26, 11:01 AM
My 2.66GHz Mac Pro doesn't use all four cores except on rare occasions (e.g. benchmarks, QuickTime, HandBrake, etc.), and even then it doesn't peg them all.
In other words, your average workload doesn't contain enough concurrent work items that are CPU-bound.
What I'm most interested in is offloading OpenGL to a core, the GUI to another core, etc. ...somewhat a nonsensical statement...
Threads of work are spread across available cores automatically. If a thread is ready to run and a core is idle, then that thread will run on that core.
Aspects of the "UI" frameworks are multithreaded and will automatically utilize one or more cores (in some cases the frameworks increase the number of threads they use based on how many cores exist in the system). In other words, the UI will already potentially use more than one core on a multi-core system.
The same can happen with OpenGL now... say if the game developer, for example, utilizes one or more threads to calculate the game world state and a second thread to call into OpenGL to render that game world... or by enabling the multithreaded OpenGL renderer (only available on Mac Pro systems at this time).
Of course that assumes the tasks you run are CPU-intensive enough to even begin to consume the compute resources available to you in new systems... in the end you should measure the overall throughput of the workload you want to do, not how utilized your individual cores are when doing that workload.
Hisdem
Mar 15, 01:39 PM
Are you drunk?
Looks like it. And BTW, I don't think the Japanese people would think leaving their homeland and going to the USA is a good idea. Not saying they don't like the US, but generally, just generally, people tend to care more about their own countries and cultures than about the American ones. Just saying.
takao
Mar 15, 05:07 PM
According to current reports, the roof of reactor 4 broke apart/collapsed and two workers are considered missing.
Also, the fire which was put out earlier seems to have started again.
ryme4reson
Oct 10, 02:59 AM
Well, I tested my G4 933, and I have the CHUD tools installed so I can disable my L2 and L3 caches. I also could not get the Java to work, so I compiled with C++; it's the same stuff, but I used time(), which gave me seconds, so * 1000 to get the adjusted scores.
Here are my scores
933 256L2 2MBL3 79 seconds or 79000
933 NO L2 or L3 124 seconds or 124000
933 L2 only 79 seconds
933 L3 only 79 seconds
Judging by these scores I have to think that CHUD is not working and it only worked with completely disabled. as the diff of 45 seconds.
And you can get CHUD from apple ftp.apple.com
Needless to say it takes me 79 seconds when a PV is completing this in 5-10 seconds, something is wrong!! (the the G4)
Lastly, I have not seen BACKTOTHEMAC telling us how great the G4 is lately, must be installing Win 2K under VPC with a stopwatch in 1 hand, an apple in the other, and a smile on his face...
<EDIT> I am gonna try to run this on my brothers 333 celeron on a 66MHZ bus with 320 RAM, I know my 933 is not the fastest, but maybe it just found its competition. :) </EDIT>
redkamel
Apr 13, 01:16 AM
When Apple's Pro App for photographers, Aperture, hit the App Store, the price dropped from $200 to only $80. Compare this to Adobe's $300 Lightroom app.
Providing Pro Apps at such low prices helps to establish Apple's hardware as more affordable. Today's young computer users bring a sophistication to application utilization that previous generations did not. High school students quickly outgrow iMovie's capabilities in their media classes and are prepared to move up.
Forget "Pro Apps" - these are "Advanced Apps" and, though the pros may not like it, these apps are going to make it into the hands of amateurs and hobbyists. As a professional photographer, I recommend Aperture to even the most novice digital photographer - if you can understand iPhoto, Aperture is within reach.
Ultimately, don't let the low price fool you. Volume of sales and baiting eager pro app users to the Apple OS will do more for Apple than trying to make these apps solely available to professionals. Software-only companies are at a big disadvantage here- selling inexpensive (and great) software will ultimately increase their overall sales as the hardware flies off the shelves.
I think a large part of it has to do with how Aperture is much more visual while PS is more menu based. It makes it much easier to learn.
I'd agree; Apple is dropping software prices for good reasons.
1. Computers are very powerful nowadays. It is stupid to make pro apps out of the reach of people who own prosumer machines...even a mid level macbook pro can run Aperture and FCP to some extent. Might as well use that power and sell software along with giving a halo effect to all your machines. FCP is linked to Apple. Avid, Lightroom are not.
2. It sells computers when amateurs or pros can get pro apps for cheap and vice versa. I know if I was OS neutral and owned a business or was an amateur, I'd rather have reliable, shiny "cool" macs with cheaper pro software, than cheaper windows boxes with expensive software. The functionality is likely equal, but the Apples will end up breaking even (cheaper software) and be more reliable.
3. Cheaper software means more people use it, which means it will eventually become more standard. I remember my friend and me having theories about Adobe "allowing" HS and college kids to pirate software, because when they graduated that is all they knew... and they would have to buy it if they wanted to work, and businesses would have to buy it if they wanted to hire. A cheaper alternative to legal PS would be out of luck unless it could break that cycle. I've been using Aperture since it came out. You think I want to work for someone using Lightroom or Aperture? (actually, I guess it doesn't really matter... :p work would be work)
Žalgiris
May 2, 09:15 AM
Bigger, most Windows PCs have anti-virus; can you say the same for Macs?
The one thing Macs need anti-virus for is to scan mail for Windows viruses, so those don't get passed on to your PC. That is all.
BladesOfSteel
May 5, 10:51 AM
I have had AT&T for almost three years now, and I haven't had one dropped call.