Tuesday, June 1, 2010

Is Thunderbird Too Little, Too Late? | Linux Magazine


You win some, you lose some. The Mozilla project has won big with Firefox, but not so much with Thunderbird. Thunderbird 3 is a decent mail user agent, but it doesn’t seem to have the right stuff to break out into widespread usage.
Joe Brockmeier
Tuesday, May 25th, 2010

A few weeks ago, the Mozilla Messaging folks released the second beta for Thunderbird 3.1. The list of features amounts to some nice improvements, but nothing revolutionary. One has to wonder if Thunderbird will ever be relevant to a wide audience, or if the Mozilla Messaging team should be focusing on more than incremental improvements to an old-school mailer.

I don’t mean to harsh on Thunderbird unduly, or disparage the good work being done by the Thunderbird developers. Criticizing Thunderbird feels a bit like kicking a puppy with a boot made out of kitten skins. It seems harsh to say, but Thunderbird appears to be floundering as a project and certainly isn’t grabbing market share the way Firefox has. That’s a shame, because the philosophy behind Mozilla Messaging is very user-centric.

It’s harder to find stats for mail client usage than for browser usage. Looking at some email campaign companies that fingerprint mailers, you’ll see Thunderbird with about 1% to 2.4% of the market. Compare this to more than 35% for Outlook (all variants) and a pretty hefty share for Webmail clients. The iPhone comes in with more than 8% according to Campaign Monitor.

Fuzzy Use Case

One of the problems with Thunderbird is that it doesn’t seem to fit with most users’ needs for email. That is, it doesn’t work well for business users who need features like calendaring and groupware connectivity, and it doesn’t work well for casual mail users who have mostly adopted Webmail or whatever ships on their computer.

If the stats compiled by Campaign Monitor and Fingerprint are relatively accurate, they show that most users are working with Outlook, Webmail, or the mailer that ships with their iPhone or Mac. In other words, casual users who need compelling features to be motivated to switch, or advanced users who need connectivity to Exchange.

Thunderbird isn’t well-suited in either case. Don’t get me wrong, Thunderbird is a decent mailer for advanced users. But it doesn’t seem to be focused on any specific use case. It’s too complex for casual users, and not full-featured enough for business users. The audience it is well suited for is not large enough to push it into double digits.

Why’s that a problem? Mozilla Messaging needs to find some ways to fund the project and encourage more developers. The Mozilla Foundation gets most of its money off the search deal with Google right now. It can earn big bucks to pay hundreds (yes, hundreds) of people to work on Firefox and other Mozilla projects because it has enough users. And Firefox is large enough that third parties want to participate and help make Firefox better. Thunderbird is not seeing that kind of momentum.

No Developer Momentum

One of the major problems that Thunderbird has is that it has very little in the way of a developer ecosystem. Firefox was made great, and massively popular, in large part thanks to its developer community, especially the enormous add-on community. Even before Firefox was as popular as it is today, it had a thriving add-on developer community.

Thunderbird? Not so much. Actually, the mailer seems to be going backwards a bit. A friend of mine was searching for an add-on to send out emails at a specific time. Nothing like that exists for Thunderbird 3.x, but there was an add-on that did this a couple of years ago. It just hasn’t been maintained. This was what started me thinking about Thunderbird and where it was going.

The Lightning and Sunbird calendaring projects have been struggling due to lack of developers. They have been in development for years, but still haven’t made it to 1.0.

Thunderbird Stands Alone

Another problem that Thunderbird has is the lack of a server-side solution. Sure, it handles IMAP and POP3, but good luck with Exchange, GroupWise, etc. On the consumer side, it lacks solutions like MobileMe to sync contacts and such between computers.

And there’s no mobile Thunderbird solution in the picture or on the horizon. Thunderbird doesn’t fit well with the way many people are using mail.

The Raindrop project from Mozilla Labs looks interesting as a way to unify different messaging services. But it doesn’t really interact with Thunderbird.

No Killer Features

Thunderbird has a few nifty features in the 3.x series, notably around search. But really, Thunderbird doesn’t have any features I can think of that make it a “must have” over any other mailer, especially on Windows or Mac OS X.

When I compare Thunderbird to other Linux mailers, I can’t think of any features that make it super-compelling next to Evolution or KMail. It’s not bad, it just isn’t across-the-board better, either. And for power users, it’s probably not as interesting as Claws or Mutt. Thunderbird is moderately customizable, but not to the extent of Claws, Mutt, or Gnus for Emacs.

Suggestions

Despite finding fault with Thunderbird as it is today, I’d really like to see Mozilla Messaging succeed, and succeed wildly. There should be little doubt that the Web is a better place today thanks to Mozilla Firefox, regardless of whether you use Firefox itself.

Email, calendaring, and other groupware are a cesspit today. Email has not improved significantly in the 15 years I’ve been using it. Calendaring is still a mess of corporate and individual silos that make it next to impossible to conveniently set meetings between individuals or organizations.

It’d be a good idea for the Thunderbird folks to think seriously about fixing some of the back-end problems, and to decide whether to focus on consumer or corporate use. I’d recommend consumer for the time being. Mozilla should also think about hosting mail for users as a way to help fund development, and perhaps even developing a Webmail solution instead of concentrating on a desktop client.

Thunderbird also needs to have some killer features that help it stand out from the pack. Being able to send timed email would be a start. Allowing users to annotate emails and do more contact management within the mailer would be another great feature. Thunderbird desperately needs calendaring. The project should stop focusing on Sunbird and make Lightning an optional part of the Thunderbird install.

A mobile strategy also seems like a necessity, though Thunderbird mobile seems unlikely. Apple probably wouldn’t approve it on the iPhone, and I’m imagining it would have a hard time gaining traction on Android or Blackberry.

Thunderbird is a decent mailer, but it’s not a game-changer the way that Firefox was. If it doesn’t improve drastically, it seems doomed to always have only a sliver of the market — which makes it unlikely the project as a whole will succeed. So far, the improvements in Thunderbird 3.x have been too little and too late to drive mass adoption.


Tesla’s Elon Musk: “I ran out of cash” | VentureBeat

May 27, 2010 | Owen Thomas

Tesla Motors CEO Elon Musk seems to have it all. The electric-car entrepreneur is the toast of Silicon Valley, Sacramento, and Tokyo after unveiling a plan to revive Toyota’s shuttered NUMMI plant last week. And deal-hungry Wall Street bankers are angling to take his company public. He’s even a Hollywood star, with a cameo in the hit Iron Man 2 movie, said to be based on his life story.

The one thing he doesn’t have, by his own admission, is money.

“About four months ago, I ran out of cash,” he wrote in a court filing dated Feb. 23, reviewed by VentureBeat. That’s a problem not just for him but for Tesla, where he is the lead investor and chief product architect, as well as CEO. Musk’s willingness to funnel his own cash into Tesla has for years sustained the faith of fellow investors and reassured would-be car buyers in 2008 when the company’s finances were in perilous shape.

According to the filing — part of his pending divorce case from sci-fi novelist Justine Musk — Elon Musk has been living off personal loans from friends since October 2009 and spending $200,000 a month while making far less. Musk confirmed this in an interview with VentureBeat.

Tesla, likewise, is dealing with its cash flow problems by borrowing money from a friendly source — the United States government, which has eagerly backed cleantech startups through a Department of Energy loan program. Tesla burned through $37 million in cash in the last three months of 2009, according to amended S-1 documents, filed with the Securities & Exchange Commission in preparation for its IPO. Tesla slowed this burn rate in the first quarter of 2010 to $8.4 million, but only by drawing down part of a $465 million loan from the DOE, while reporting a net loss of $29.5 million. Tesla’s sales were flat year-over-year in the first quarter, but declined precipitously in the U.S., according to a former Tesla executive.

Now, Toyota has agreed to buy $50 million in shares at the time of Tesla’s initial public offering — if it manages to go public before Dec. 31. But for now, the company doesn’t have access to that promised cash, and must pay $42 million to buy the NUMMI plant in Fremont, Calif., from a Toyota-General Motors joint venture.

Only one thing is certain: Tesla’s not getting more money from Musk.

Divorced from his fortune

Musk was Tesla’s first investor, and he kept the company afloat until recently through round after round of funding. After a Tesla employee leaked word in October 2008 to a reporter that the company was down to its last $9 million in cash, Musk promised to personally refund car buyers’ deposits if Tesla couldn’t deliver the vehicles — a promise he made in the pages of Car & Driver. At that time, those deposits — which Tesla calls “reservation payments” — were an important source of cash for the company.

And throughout Tesla’s history, Musk has used his entrepreneurial legend — Zip2, sold for $305 million to Compaq; PayPal, sold to eBay for $1.5 billion — to bolster his credibility as a technology executive. Musk’s personal take from Zip2 was a reported $22 million, much of which he invested in his next startup, PayPal, netting $160 million when eBay bought the online-payments startup. According to filings in his divorce trial, he had roughly $48 million in income from his investments between 2005 and 2008. But he sank much of that money back into Tesla, as well as his other enterprises, the space-exploration concern SpaceX and solar panel finance startup SolarCity.

His finances were not always so strained. In other documents filed in the divorce case, Musk reportedly made $9,551,753 in 2008 and an average of $17.2 million a year from 2005 to 2008. As of Dec. 31, 2008, he also had extensive holdings in venture capital and private equity partnerships, ranging from Softbank Technology Ventures to Charles River Ventures to Clarium Capital. These partnerships, however, tend to be highly illiquid investments: It can take months to get out of them because you have to find a sophisticated buyer willing to bear the risks of a private sale.

As he ran low on cash, a contentious divorce — in which his ex-wife, Justine Musk, is seeking a sizable chunk of Musk’s holdings — caused him more financial problems. Justine Musk is asking a court to rip up a post-nuptial agreement she and Elon Musk signed in March 2000, which could in theory lead to much of his holdings being deemed community property. While there’s no telling how the case will turn out — it has already gone to appeal — more important is the protective order the court has slapped on Musk’s holdings in Tesla and his other illiquid assets. These include his stakes in private equity funds. He won’t be able to sell significant holdings without first getting permission from his ex-wife. And he has also been ordered by a court to continue paying her legal fees for the duration of the lengthy appeal process.

Refueling Tesla’s cash

Musk still owns roughly a third of Tesla — some 81 million shares out of approximately 250 million outstanding, according to the company’s filings. But keeping his ownership stake that high has come at a cost. In November 2007, in order to wield enough voting power to oust Tesla co-founder Martin Eberhard as CEO, he converted 8 million of his preferred shares into common shares. Two months later, Musk participated in a bridge loan to rebuff a separate effort by VantagePoint Venture Partners, a significant investor, to lead a deal that would have seriously diluted Musk’s control. VantagePoint partner Jim Marver left Tesla’s board as a result. From the perspective of Musk’s board allies, the move steadied the company at a time of significant employee turnover and potential loss of morale. (A VantagePoint spokesman declined to comment on Tesla board matters.)

The moves kept Musk in control of Tesla, but they also meant that his stake kept getting diluted in subsequent financing rounds. (Preferred shares often hold anti-dilution rights, but common shares typically do not.) And there were many subsequent rounds, including a highly dilutive convertible debt round in 2008. The first sign of trouble came last fall, when Musk, for the first time, did not participate in a financing round for Tesla.
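
(An illustrative aside, not from the VentureBeat piece: the dilution arithmetic itself is simple. A holder who sits out a round keeps the same share count while the total grows. A toy Python sketch with invented numbers, not Tesla's actual cap table:)

    # Toy dilution arithmetic with invented numbers (not Tesla's cap table).
    # A non-participating holder keeps the same shares while the total grows.
    holder, total = 40_000_000, 100_000_000      # hypothetical starting stake: 40%
    for new_shares in (20_000_000, 30_000_000):  # two hypothetical financing rounds
        total += new_shares
        print(f"after issuing {new_shares:,} new shares: {100 * holder / total:.1f}%")
    # 40% becomes 33.3% after the first round and 26.7% after the second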

The company has not disclosed Musk’s lack of financial liquidity or the potential implications of his divorce case in its filings — only that it is highly dependent on Musk’s services. Tesla has also begun reimbursing Musk for his private-jet flights, an expense he previously paid out of pocket. And while Tesla pays Musk only a minimal salary, its board awarded him 6.7 million stock options in December 2009 — the first time he has taken this kind of equity as compensation. It seems that Musk’s compensation from Tesla has increased since his personal finances became an issue.

A matter of disclosure

Should Tesla have mentioned all these facts in its S-1 filings? Eric Talley, a professor of law at Berkeley and co-director of the Berkeley Center for Law, Business, and the Economy, notes that Section 11 of the 1933 Securities Act requires that companies registering to go public not make materially misleading statements or omissions. But it’s far from clear what’s material in these cases, he said: “It’s not a black and white rule.”

A longtime observer of the company thinks the state of Musk’s finances is worth disclosing. “It’s up to the courts to decide, but this feels like material information,” said Dallas Kachan, managing partner of Kachan & Co., a cleantech research and analysis consultancy which follows Tesla.

The easiest way for Musk to get out of debt to his friends and settle accounts with his ex-wife would be for Tesla to go public and for Musk to unload much of his stake. After an IPO, his shares of Tesla would become a readily salable asset — except for the protective orders in his divorce case and a requirement of the DOE loan that Musk hold onto a certain percentage of his shareholdings until some time after units of Tesla’s forthcoming Model S start rolling off the NUMMI assembly line.

Asked to comment on whether Tesla’s disclosures so far have been adequate, John Heine, deputy director of the Securities and Exchange Commission’s Office of Public Affairs, said his agency does not comment to the press on companies with pending registrations. Ricardo Reyes, a spokesman for Tesla Motors, has previously said the company had no plans to revise its filings with the SEC to reflect the possible impact of Musk’s divorce as a risk factor.

Should the company have said more? Perhaps, argues one observer.

“Transparency is thought to be a good thing for the operation of capital markets,” said Talley, the Berkeley law professor. “Bare compliance with SEC rules isn’t enough.”


Slashdot Your Rights Online Story | Where Do You Go When Google Locks You Out?

Lobais sends in the cautionary tale of a man who was locked out of Google Groups for three years — losing the ability to administer his own open source project in the process. 'After about a year of using Google Groups for the PyChess project, I started [noticing] a problem. When I wrote mails to the list, no one would answer. And when I answered other peoples' post[s], they seemed to ignore them and press for new answers. As I tried to check the online group to see what was happening, I got a 403 Forbidden error. After a short while I realized that this error was given for any page on the groups.google.com subdomain. The lockout meant that I was unable to manage the PyChess mailing list. I was unable to fight the increasing spam level, and more importantly I couldn't reply to anybody in my community. I wasn't even able to visit the Google help forums, which are all on groups.google.com. As the services are free of charge, I never really expected any support options. ... How can we know how often this kind of thing happens? If any admin can lock you out by a sloppy click, and give you no option to defend yourself, then it is bound to happen once in a while.'


Indexing timeline

Matt Cutts: Gadgets, Google, and SEO

May 16, 2006

in Google/SEO

Heh. I wrote this hugely long post, so I pulled a Googler aside and asked “Dan, what do you think of this post?” And after a few helpful comments he said something like, “And, um, you may want to include a paragraph of understandable English at the top.” :)

Fair enough. Some people don’t want to read the whole mind-numbingly long post while their eyes glaze over. For those people, my short summary would be two-fold. First, I believe the crawl/index team certainly has enough machines to do its job, and we definitely aren’t dropping documents because we’re “out of space.” The second point is that we continue to listen to webmaster feedback to improve our search. We’ve addressed the issues that we’ve seen, but we continue to read through the feedback to look for other ways that we could improve.

People have been asking for more details on “pages dropping from the index” so I thought I’d write down a brain dump of everything I knew about, to have it all in one place. Bear in mind that this is my best recollection, so I’m not claiming that it’s perfect.

Bigdaddy: Done by March

- In December, the crawl/index team were ready to debut Bigdaddy, which was a software upgrade of our crawling and parts of our indexing.
- In early January, I hunkered down and wrote tutorials about url canonicalization, interpreting the inurl: operator, and 302 redirects. Then I told people about a data center where Bigdaddy was live and asked for feedback.
- February was pretty quiet as Bigdaddy rolled out to more data centers.
- In March, some people on WebmasterWorld started complaining that they saw none of their pages indexed in Bigdaddy data centers, and were more likely to see supplemental results.
- On March 13th, GoogleGuy gave a way for WMW folks to give example sites.
- After looking at the example sites, I could tell the issue in a few minutes. The sites that fit “no pages in Bigdaddy” criteria were sites where our algorithms had very low trust in the inlinks or the outlinks of that site. Examples that might cause that include excessive reciprocal links, linking to spammy neighborhoods on the web, or link buying/selling. The Bigdaddy update is independent of our supplemental results, so when Bigdaddy didn’t select pages from a site, that would expose more supplemental results for a site.
- I worked with the crawl/index team to tune thresholds so that we would crawl more pages from those sorts of sites.
- By March 22nd, I posted an update to let people know that we were crawling more pages from those sorts of sites. Over time, we continued to boost the indexing even more for those sites.
- By March 29th, Bigdaddy was fully deployed and the old system was turned off. Bigdaddy has powered our crawling ever since.

Considering the amount of code that changed, I consider Bigdaddy pretty successful in that I only saw two complaints. The first was one that I mentioned, where we didn’t index pages from sites with less trusted links, and we responded and started indexing more pages from those sites pretty quickly. The other complaint I heard was that pages crawled by AdSense started showing up in our web index. The fact that Bigdaddy provided a crawl caching proxy was a deliberate improvement in crawling and I was happy to describe it in PowerPoint-y detail on the blog and at WMW Boston.

Okay, that’s Bigdaddy. It’s more comprehensive, and it’s been visible since December and 100% live since March. So why the recent hubbub? Well, now that Bigdaddy is done, we’ve turned our focus to refreshing our supplemental results. I’ll give my best recollection of that timeline too. Around the same time, there was speculation that our machines were full. From my personal perspective in the quality group, we certainly have enough machines to crawl/index/serve web results; in fact, Bigdaddy is more comprehensive than our previous system. Seems like a good time to throw in a link to my disclaimer right here to remind people that this is my personal take.

Refreshing supplemental results

Okay, moving right along. As I mentioned before, once Bigdaddy was fully deployed, we started working on refreshing our supplemental results. Here’s my timeline:
- In early April, we started showing some refreshed supplemental results to users.
- On April 13th, someone started a thread on WMW to ask about having fewer pages indexed.
- On April 24th, GoogleGuy gave a way for people to provide specifics (WebmasterWorld, like many webmaster forums, doesn’t allow people to post specific site names.)
- I looked through the feedback and didn’t see any major trends. Over the next week, I gave examples to the crawl/index team. They didn’t see any major trend either. The sitemaps team investigated until they were satisfied that it had nothing to do with sitemaps either.
- The team refreshing our supplemental results checked out feedback, and on May 5th they discovered that a “site:” query didn’t return supplemental results. I think that they had a fix out for that the same day. Later, they noticed that a difference in the parser meant that site: queries didn’t work with hyphenated domains. I believe they got a quick fix out soon afterwards, with a full fix for site: queries on hyphenated domains in supplemental results expected this week.
- GoogleGuy stopped back by WMW on May 8th to give more info about site: and get any more info that people wanted to provide.

Reading current feedback

Those are the issues that I’ve heard of with supplemental results, and those have been resolved. Now, what about folks that are still asking about fewer pages being reported from their site? As if this post isn’t long enough already, I’ll run through some of the emails and give potential reasons that I’ve seen:

- First site is a .tv about real estate in a foreign country. On May 3rd, the site owner says that they have about 20K properties listed, but says that they dropped to 300 pages. When I checked, a site: query shows 31,200 pages indexed now, and the example url they mentioned is in the index. I’m going to assume this domain is doing fine now.

- Okay, let’s check one from May 11th. The owner sent only a url, with no text or explanation at all, but let’s tackle it. This is also a real estate site, this time about an Eastern European country. I see 387 pages indexed currently. Aha, checking out the bottom of the page, I see this:
Poor quality links
Linking to a free ringtones site, an SEO contest, and an Omega 3 fish oil site? I think I’ve found your problem. I’d think about the quality of your links if you’d prefer to have more pages crawled. As these indexing changes have rolled out, we’ve improved how we handle reciprocal link exchanges and link buying/selling.

- Moving right along, here’s one from May 4th. It’s another real estate site. The owner says that they used to have 10K pages indexed and now they have 80. I checked out the site. Aha:
Poor quality links
This time, I’m seeing links to mortgages sites, credit card sites, and exercise equipment. I think this is covered by the same guidance as above; if you were getting crawled more before and you’re trading a bunch of reciprocal links, don’t be surprised if the new crawler has different crawl priorities and doesn’t crawl as much.

- Someone sent in a health care directory domain. It seems like a fine site, and it’s not linking to anything junky. But it only has six links to the entire domain. With that few links, I can believe that out toward the edge of the crawl, we would index fewer pages. Hold on, digging deeper. Aha, the owner said that they wanted to kill the www version of their pages, so they used the url removal tool on their own site. I’m seeing that you removed 16 of your most important directories from Oct. 10, 2005 to April 8, 2006. I covered this topic in January 2006:

Q: If I want to get rid of domain.com but keep www.domain.com, should I use the url removal tool to remove domain.com?
A: No, definitely don’t do this. If you remove one of the www vs. non-www hostnames, it can end up removing your whole domain for six months. Definitely don’t do this. If you did use the url removal tool to remove your entire domain when you actually only wanted to remove the www or non-www version of your domain, do a reinclusion request and mention that you removed your entire domain by accident using the url removal tool and that you’d like it reincluded.

You didn’t remove your entire domain, but you removed all the important subdirectories. That self-removal just lapsed a few weeks ago. That said, your site also has very few links pointing to you. A few more relevant links would help us know to crawl more pages from your site. Okay, let’s read another.
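
(An editorial aside before the next example, not from Cutts’s post: the usual way to keep www.domain.com and drop domain.com is a server-side 301 redirect rather than the removal tool. A quick sketch for checking that such a redirect is in place; example.com is a placeholder domain:)

    # Check whether the bare domain 301-redirects to the www host, the
    # standard way to consolidate on one canonical hostname.
    # "example.com" is a placeholder.
    import urllib.error
    import urllib.request

    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None  # don't follow; we want to inspect the redirect itself

    opener = urllib.request.build_opener(NoRedirect)
    try:
        opener.open("http://example.com/")
        print("no redirect: both hostnames serve pages separately")
    except urllib.error.HTTPError as err:
        # A canonical setup answers 301 with Location: http://www.example.com/
        print(err.code, err.headers.get("Location"))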

- Somebody wrote about a “favorites” site that sells T-shirts. The site had about 100 pages, and now Google is showing about five pages. Looking at the site, the first problem that I see is that only 1-2 domains have any links at all to you. The person said that every page has original content, but every link that I clicked was an affiliate link that went to the site that actually sold the T-shirts. And the snippet of text that I happened to grab was also taken from the site that actually sold the T-shirts. The site has a blog, which I’d normally recommend as a good way to get links, but every link on the blog is just an affiliate link. The first several posts didn’t even have any text, and when I found an entry that did, it was copied from somewhere else. So I don’t think that the drop in indexed pages for this domain necessarily points to an issue on Google’s side. The question I’d be asking is why anyone would choose your “favourites” site instead of going directly to the site that sells T-shirts?

Closing thoughts

Okay, I’ve got to wrap up (longest. post. evar). But I wanted to give people a feel for the sort of feedback that we’re getting in the last few days. In general, several domains I’ve checked have more pages reported these days (and overall, Bigdaddy is more comprehensive than our previous index). Some folks that were doing a lot of reciprocal links might see less crawling. If your site has very few links where you’d be on the fringe of the crawl, then it’s relatively normal that changes in the crawl may change how much of your site we crawl. And if you’ve got an affiliate site, it makes sense to think about the amount of value-add that your site provides; you want to provide a reason why users would prefer your site.

In March, I was able to read feedback and identify an issue to fix in 4-5 minutes. With the most recent feedback, we did find a couple of ways that we could make site: more accurate, but despite having several teams (quality, crawl/index, sitemaps) read the remaining feedback, we’re seeing more of a grab-bag of feedback than any burning issues. Just to be clear, I’m not saying that we won’t find other ways to improve. Adam has been reading and replying to the emails and collecting domains to dig into, for example. But I wanted to give folks an update on what we were seeing with the most recent feedback.


ASUS Eee Pad EP101TC, EP121 and Eee Tablet get official - SlashGear

By Chris Davies on Monday, May 31st 2010

Computex 2010 doesn’t kick off in earnest until tomorrow, but a few of the big name exhibitors have snuck in ahead with some early press conferences today. ASUS are first out of the gate with their iPad competition, and they’re taking a three-pronged approach: the ASUS Eee Pad will be available in 10-inch (EP101TC) and 12-inch (EP121) variants, and offer Windows Embedded Compact 7 and Windows 7 Home Premium respectively, while the ASUS Eee Tablet is a monochrome digital notebook, packing 2,450dpi touchscreen input sensitivity and targeted at note-takers like students.


The ASUS Eee Pad EP101TC doesn’t have finalised specs as yet, but it measures just 12.2mm thick and weighs 675g. The Windows Embedded Compact 7 OS promises the same flexibility and interface existing users expect, but there’ll be more cloud integration too.

As for the ASUS Eee Pad EP121, that gets Windows 7 Home Premium for everything you’d expect a notebook to run, powered by an Intel Core 2 Duo CULV processor. That also means it’ll have the usual notebook connectivity, so expect USB and a webcam. ASUS have apparently also developed a keyboard dock, similar to what Apple offer for the iPad, for when you’re deskbound. Neither model is expected to hit the market until Q1 2011, with prices tipped at between $399 and $499.

Finally, the ASUS Eee Tablet is a good example of very specialised segment targeting. It uses a non-backlit monochrome LCD display, not E Ink, and as such has 0.1-second page transitions. There’s also a 2-megapixel camera, which ASUS envision students using to snap images of lecture slides and then annotate them with the high-resolution touchscreen and stylus. Sync is via USB or microSD card; there’s a battery good for up to 10hrs, and various apps give the impression of using a paper notebook – only one that you can search through more readily. The Eee Tablet is expected in September 2010, priced between $199 and $299.




Press Release:

Stay Connected and Multitask with the Eee Pad

Engineering excellence meets stunning design in the ASUS Eee Pad, an ultra-slim and light yet high-performance slate device designed to provide users with a real time cloud computing experience. The Eee Pad will be available in two configurations.

The 12″ Eee Pad EP121 is a full-featured slate computer that serves as a multimedia player, e-reader, and compact computing device. Powered by a CULV Intel® Core™ 2 Duo processor and the Windows® 7 Home Premium operating system, it effortlessly handles multitasking, whether users are checking email and calendars, holding video conferences, or working on Microsoft Word and Excel documents simultaneously. The ASUS Eee Pad EP121 offers two convenient modes of character input: an embedded virtual keyboard or an innovative hybrid keyboard/docking station design. All of this power is available in a personal computing device that delivers up to 10 hours of usage.

For users seeking additional mobility, ASUS is proud to present the 10” Eee Pad EP101TC that runs Windows Embedded Compact 7, which provides an engaging user experience and delivers instant connectivity to the Windows world. It also provides a familiar full-featured user experience across various connected devices and cloud computing services.

Press Release:

ASUS Provides Tomorrow’s Technologies Today at Computex 2010

Innovative Eee Pad and Eee Tablet extend ASUS’ leadership in Cloud Computing

ASUS’ leadership in innovation and design will once again be the focus at Computex 2010 in Taipei, Taiwan. ASUS will proudly showcase a wide range of products across five major categories: cloud computing, gaming, enthusiast-level PC components, multimedia and green computing. As a technological leader in cloud computing, ASUS offers a broad lineup of cloud-connected devices featuring on-the-fly data and multimedia sharing capabilities that consumers crave in today’s market.

The Notepad Goes Digital with the Eee Tablet
Innovation meets cloud computing at Computex 2010 with the ASUS Eee Tablet. With a 2450 dpi touch resolution screen, the Eee Tablet is one of the world’s most accurate and sensitive digital note taking devices, and gives the user the feel of writing on paper. Users can select one of the built-in notepad templates and have the option to store, sort and tag, organize or browse through them. Real time text annotations can also be made on-the-fly. The Eee Tablet makes reading easy, with text file page turns taking just 0.1 seconds, nine times faster than the page turns of normal e-readers. Reading documents or books remains easy on the users’ eyes even after prolonged viewing periods.

The ASUS Eee Tablet features a built-in 2 megapixel camera that captures detailed images, letting the user grab screenshots of lecture slides and write notes on them instantly. It easily syncs up with a PC or notebook via USB or Micro SD to ensure that all notes, content, and calendars are constantly kept up-to-date. With up to 10-hours of battery life, the Eee Tablet has enough power for a variety of tasks. At the end of the day, users not only have an electronic notepad, but a media player and e-reader as well.




Nathan's Economic Edge: Morning Update/ Market Thread 5/28

Good Morning,

Equity futures are flat following yesterday's 284 point jaunt. That was a ridiculous 97% up day by volume on the NYSE. Now, normally I would consider that a “panic” buying day, but it came on much lower volume. Remember, volume confirms price, and volume hasn’t confirmed any direction but down for the past 3 years.
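
(For readers who haven’t seen the metric: a “97% up day” means advancing issues carried 97% of the combined up-plus-down volume. A toy sketch of the arithmetic; the session data is invented:)

    # "Percent up day by volume": up_volume / (up_volume + down_volume).
    # The session data below is invented, purely to show the arithmetic.
    def percent_up_volume(issues):
        """issues: (price_change, share_volume) pairs for one session."""
        up = sum(vol for change, vol in issues if change > 0)
        down = sum(vol for change, vol in issues if change < 0)
        return 100.0 * up / (up + down)

    session = [(+1.2, 40_000_000), (+0.4, 35_000_000),
               (+2.1, 22_000_000), (-0.3, 3_000_000)]
    print(f"{percent_up_volume(session):.0f}% up day by volume")  # -> 97%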

And last night it dawned on me that what’s occurring is not the same thing as what used to occur in the good old days of 90ish percent up or down days… Of course you know that the market is extremely volatile, but we’ve had SIX 90%+ down days in the past few weeks to go along with another couple of 90%+ up days (no follow-through on the upside, only on the downside). These have not just been 90%, they’ve been in the high 90s. Now combine that fact with the fact that four banks had perfect trading quarters… the conclusion is that HFT (High Frequency Trading) has become so good and so fast at chasing short term momentum that all the machines get on the same side of the trade way faster and more efficiently than they used to. I mean, come on – 97% of all the trading volume was long yesterday; that’s simply absurd. Was mom and pop going long because we finally broke back above 1,090? Hardly. These lopsided days are yet another sign of how HFT and a very few players have taken over the markets. I reiterate: the markets no longer exist, they are a computer simulation.


Gmail has become unusably slow - Gabriel Weinberg's Blog

By Gabriel Weinberg on May 28, 2010 8:35 AM
When I switched to Gmail in 2004, I believed the hype. Never delete a message again--no need. We have tons of space, and you can search it all really fast like Google.

That time has passed. Gmail has gotten slower and slower for me, and as of the last few weeks it has become unusably slow. Before you ask, yes, I've tried it across lots of browsers and computers.

It can take 20sec to switch labels, and even longer to search for something. But here's the worst part--it takes just as long to send a simple message!?! Why? What does sending have to do with anything?

It's become the bottleneck in my day, and I don't know what to do about it. And I'm not alone.

A few days ago I decided to start taking action. First I emailed support. OK, first, I tried to email support.

Have you ever tried to email Google support? It's almost impossible to find the contact form. Here's the support home page. I dare you to find out where to report this slowness issue.

You get to this page on slowness. After going through the wizard, you click on 'report your issue' at the bottom, and it takes you here. Wait, that's not a contact form, and you can't get to one from that page! Anyway, here is a contact form; I found it going through another problem wizard.

Needless to say, I haven't heard a response :)

Next step: I disabled chat, buzz & tried the older versions of Gmail. No luck. Then I disabled all labs, after which I perceived a very modest improvement, but still unusable.

Next I removed most of my labels. I have four now (down from 32). This seemed to help a bit as well, but still not much.

So this morning I went drastic. I deleted all my contacts and started deleting mail. Ridiculous huh? That totally breaks the original selling point of Gmail, but like I said, I'm at wits' end here.

Deleting stuff has resulted in the biggest improvement so far, but it's still slow. Perhaps a bit better than unusable now, but still terrible.

You are currently using 4247 MB (56%) of your 7459 MB.

In a last ditch effort, I bought some extra storage from Google thinking maybe I'd get some kind of premium level service. So far, no.

Google's been recently launching lots of cloud products, most recently a storage product to compete with Amazon's S3.

In other words, they obviously have the resources to make Gmail fast. So what's the deal? They must know about the slowness. The only reasonable explanation is that they are consciously under-resourcing it. Again, why?

Lame! (cut from South Park's AWESOM-O)

Update: there are also a lot of good comments on HN.

Update2: after a bunch of testing with my account, I'm confident that, at least in my case, the slowness involves having more than 4GB of mail. I deleted a lot of messages and got down to 3.6GB. It was then relatively fast again. I then sent myself a 25MB file (the limit) repeatedly until I got back up to 4GB. Right after 4GB, it got slow. Go figure.
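
(One way to put numbers on this kind of testing, outside the web interface, is to time operations over IMAP, which Gmail has offered since 2007. A rough sketch of mine, with placeholder credentials and search term:)

    # Rough harness for timing Gmail operations over IMAP.
    # The credentials and the search term are placeholders.
    import imaplib
    import time

    def timed(label, fn):
        start = time.time()
        result = fn()
        print(f"{label}: {time.time() - start:.1f}s")
        return result

    conn = imaplib.IMAP4_SSL("imap.gmail.com")
    timed("login", lambda: conn.login("user@gmail.com", "password-placeholder"))
    timed("select inbox", lambda: conn.select("INBOX"))
    timed("search", lambda: conn.search(None, '(SUBJECT "invoice")'))
    conn.logout()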




Ballmer just opened the Second Envelope | Monday Note

May 30, 2010 - 8:45 pm | Edited by Jean-Louis Gassée

You know the business lore joke. The departing CEO meets his successor and hands him three envelopes to be opened in the prescribed order when trouble strikes. First crisis, the message in envelope #1 says: Blame your predecessor. Easy enough. Another storm, and the CEO opens the second envelope: Reorganize. Good idea. And when calamity strikes yet again, he reaches for the third: Get three envelopes…

This past Tuesday, Steve Ballmer reorganized Microsoft’s Entertainment & Devices division, let go of its execs, Robbie Bach and J Allard, and moved a few more pieces around. All wrapped in the most mellifluous, Orwellian language we’ve seen from Microsoft in a while. The full memo is here. We’re treated to encomiums to great work, friendship, spending more time with one’s family, leaving on a high note… under the guise of decency, this is indecent.
Ballmer’s view of executive leadership doesn’t admit standing up and taking responsibility. He can’t say ‘I screwed up’ and then explain what he’ll do to rectify the situation. No. Instead, two gents are fingered while they pretend they aren’t being blamed. In a surreal, a cappella farewell memo, J Allard writes to his soon former troops:
‘No one can touch our talent, our impact or our ambition. We’re the only high-tech company with the track record and self-confidence to reinvent ourselves as we have. If you want to change the world with technology, this is still the best tribe out there.’

Robbie Bach dutifully plays his part in the down-is-actually-up corporate farce. He gives a long exit interview to the Microsoft-friendly blog TechFlash where he claims the dual departures are coincidental, that everything is fine. What does he have to say about tablets? Nothing much:
‘Well, tablet is an area that will evolve going forward. Certainly it’s a focus for what we’re doing in the Windows space, and how they’re thinking that space. We’re going to have a bunch of netbooks and tablet stuff that’s in the works there. We’ll just see how that evolves. I don’t think there’s anything earth-shattering about that. It’s just another set of devices, and we’ll figure out how we make sure we bring a good offering to consumers.’
And, regarding the now defunct Courier tablet:
‘Courier, first of all, wasn’t a device. The project and the incubation and the exploration we did on Courier I view as super important. The “device” people saw in the video isn’t going to ship, but that doesn’t mean we didn’t learn a bunch and innovate a bunch in the process. And I’m sure a bunch of that innovation will show up in Microsoft products, absolutely confident of it.’
Serves us right for not reading the small print on the screen during the demo. These guys obviously think we’re idiots. That’s their privilege, but they ought to be a little more discreet about their low regard for us.

Not everyone buys this BS. One blogger, Horace Dediu, offers what many believe is the right explanation: Robbie Bach was fired because he lost the HP account. As the largest PC maker, HP is a hugely important Microsoft customer. A few weeks ago, HP acquired Palm for its WebOS smartphone software platform. The slap in Microsoft’s face still resonates; it’s a verdict on the failed Windows Mobile offering and a negative prognosis on its upcoming Windows Phone 7 Series operating system for smartphones. Days after the acquisition, Mark Hurd, HP’s CEO, let it be known that WebOS will be used in connected printers. As a final blow, HP’s (future) Slate Tablet, once held high as a Windows 7 device, will also use Palm’s WebOS.

Steve Ballmer has always been Microsoft’s most powerful salesman. That he lost the HP mobile devices account—and it was Ballmer who lost it, not Robbie Bach—is yet one more reason why Microsoft shareholders are troubled. Their unhappiness can be charted by comparing the two companies’ stock prices from January 2000 to May 2010: Microsoft’s stock dropped from $56 to $25.80, while Apple shares rose from $25 to $256.88.

The morning after Steve Ballmer opened the proverbial Second Envelope, Apple’s market cap, the total value of its shares, surpassed Microsoft’s. In Wall Street terms, Apple is now the largest high-tech company, worth about $230B, a few percentage points ahead of Microsoft. Across all industries, Steve Jobs’ company is now second only to an oil company, Exxon, at $285B. When questioned about Apple overtaking Microsoft, Ballmer had this to say:
‘It is a long game. We have good competitors but we too are very good competitors,’ he said. ‘I will make more profit and certainly there is no technology company on the planet that is as profitable as we are.’

When it comes to profits, Ballmer is willing to take credit.

Over the last decade, Wall Street has declined to reward Microsoft for its superior profit. The explanation is simple: Professional investors don’t believe Ballmer, and they don’t see bigger profits in Microsoft’s future. Conversely, they bid up Apple’s shares precisely because they think the company will keep growing revenue and profits. Apple has managed to enter new, growing markets, a feat Ballmer has repeatedly failed to accomplish.

January 2000 was when Steve Ballmer was made CEO of Microsoft. Yes, we can discount the year 2000: that’s when the Internet Bubble burst, causing most high-tech shares to collapse. Still, since the end of 2000, Microsoft stock has stagnated, hovering between $25 and $30.
This never appeared to faze Ballmer. While we joke about the Steve Jobs Reality Distortion Field, Microsoft shareholders ought to worry about Steve Ballmer’s own distortion, and about the self-inflicted effects of such a strong field.
We all remember Vista, it was a godsend for Apple. Did Ballmer acknowledge that there were problems? What about the Xbox 360 reliability nightmare? The apologies were left to underlings. Then Google comes out of nowhere to take 65% of the Search market, leaving Microsoft with an Apple-like market share (I’m referring to Macs, not iPods). In MP3 players, Microsoft failed again and again in its attempts to unseat the dominant (65% market share) iPod/iTunes combo. Social networks? A tiny investment (1.6%) in Facebook. And where is Microsoft in the Microblogging world, a.k.a Twitter? Nowhere, the old Microsoft Messenger is fading away.

Now we have the most recent Microsoft failure: smartphones. When the iPhone came out in 2007, Ballmer pronounced it a ‘passing fad’. Then, in 2008, he promised that Windows Mobile would have 40% market share by 2012. To be fair, he did recognize the failure in October 2009: ‘[We] screwed up with Windows Mobile.’ The platform was effectively dead. Earlier this year in Barcelona, Ballmer introduced his new smartphone OS, Windows Phone 7 Series, available in time for the Holidays. Two months after the Barcelona event, two Kin phones emerged to a lukewarm reception… and they run on neither the old generation software nor the next one, orphaned at birth.

Undeterred, Ballmer now predicts there will be 30 million Windows Phone 7 smartphones sold in 2011. Ballmer has proudly proclaimed there will be no iPod or iPhone in his household, so that’s a few Windows Phone 7 handsets sold right there. As for the rest of the 30 million… has he heard of Android? At last week’s Google I/O, the company announced there were over 100,000 Android phones activated every day, more than 35 million Android phones this year. Given the enthusiasm of handset makers for Google’s OS, they’ll probably sell twice as many next year. Ballmer’s Reality Distortion Field is overheating.

There’s no dearth of advice for Ballmer to right the ship. You can find Galen Gruman’s at Infoworld and Anders Bylund’s at the Motley Fool. But I’m afraid none of it will work. The same leadership will cause the same effects. Ballmer is running out of envelopes.

One of Microsoft’s problems, paradoxically, is that it makes a lot of money. It can spend 15% of revenue in R&D—about $9B a year—with no market breakthrough to show for it. Great concept demos and prototypes… and then nothing. (How many new Googles, Facebooks, and Twitters could we VCs fund with that kind of money?…)

A bigger problem is Microsoft’s Board of Directors. Ten women and men with distinguished backgrounds ranging from banking to pharma, from university president to venture capitalist. (There’s a lonely entrepreneur there, Reed Hastings, CEO of Netflix. He’s a math whiz, which could explain the Netflix Prize.)
Three out of the ten members are old-time friends, connected through Harvard (Gates and Ballmer) and Stanford (Ballmer and Marquardt). Such closeness makes it difficult to make painful decisions. Furthermore, it’s not obvious how a research mathematician, the President of Harvey Mudd College—a terrific place for gifted kids—an auto executive, and a banker can parse the finer but essential points of a mobile software strategy. The PowerPoint slides look professional, the occasional demo looks good… but what can a “generalist’’ director do?

In theory, the directors’ most important function is to hire and fire the CEO. But how independent are the Microsoft directors? How could they get the CEO to open the third envelope?

Microsoft didn’t have Apple’s stroke of luck: fire one of its founders, who goes on to start two companies, Pixar and NeXT, and then comes back twelve years later, tempered by the experiences, good and bad, ready to lead the company to an amazingly successful second act. Except for Ballmer’s two-year stint at Procter & Gamble, all he and Gates have ever known is Microsoft.

So, there we are. An immensely successful company, still making large amounts of money but unable to go beyond its original Windows + Office + Exchange franchise, left behind by a combination of newcomers such as Google and Facebook with the old frenemy, Apple.

Someday, a large institutional holder will get tired of waiting, tired of watching yet another rah-rah, ‘the future will be great’ speech from Ballmer, and they’ll dump their shares. That might shock the Board into taking the required drastic action. Take a look at the institutional ownership of Microsoft stock. Many have already started selling portions of their holdings.

Next week in Los Angeles we have the Wall Street Journal’s D8 conference (the speaker line-up is here). Steve #1 and Steve #2 will be interviewed on stage by Walt Mossberg, the newspaper’s high-tech guru. We’ll see how Walt broaches the Second Envelope question with Steve #2. Will he pose hard questions or toss softballs, allowing Ballmer to give one more of his now tired and tiring ‘We’ll win…eventually’ speeches?

Come to think of it, Steve Numero Uno might very well heap praise on Microsoft’s CEO, he likes him just the way he is: with enemies like Ballmer, who needs friends?

I’ll be in attendance, and will report back next week.




Make - IR thermal imaging cameras


c0redump
May 15th 2007
I've been reading the IET journal article on infra-red thermal imaging cameras, and their medical applications. It mentions that a thermal imaging camera costs about £20,000 (about $40,000?)! Now, my question is this: why so expensive? Is it something to do with the low volume of sales? Or is there some part or parts that really puts the price up?

Way back in the 1970s, I saw a thermal imaging camera that could show variations in body temperature. It showed my glasses (actual glass lenses back then) as black, opaque to IR. The lens in the camera was made of germanium -- yes, a shiny metal lens! Is that the reason for the high cost? Do modern IR cameras still have germanium lenses?

Hope somebody who's more familiar with IR cameras can answer this! Would be great to make a thermal imager, but it looks like a tricky problem if glass lenses can't be used.

John Honniball
Myself248
May 15th 2007
The microbolometer in a thermal camera is incredibly difficult to manufacture, but not $40,000 difficult. The trick is that since they're seen as high-end devices, any sensors with flaws are discarded, instead of built into cheap cameras. If they were a consumer-grade item, more defects would be acceptable and manufacturing would get cheaper. It's a chicken-and-egg problem.

Anyway, prices are coming down, you can get a very nice thermal camera for $5,000 or so now. Do some research, if you come up with a DIY method I'd be interested!
cajunfj40
May 16th 2007
Hello c0redump,

Myself248 is essentially correct regarding high-cost medical thermal imaging cameras. Low volumes, no defect tolerance, very high quantities of FDA red tape to qualify the device, insurance against lawsuits if it fails to see something, etc. all lead up to a very high end-user price.

What is your application? If you just want to see 'hot spots' and don't need the 'false color' image, you can take the IR filter off of a B&W video camera and 'see' IR. Google IR and B&W camera and you'll find oodles of links. Some camera sensors are more sensitive in the IR range than others, so look around to see what others have played with and had good results with. You may even be able to find a filter that blocks everything except IR, which would effectively increase the sensitivity of the sensor - it won't be blinded by bright lights, just the IR portion of that light. To do a quick check to see if a camera can see IR, flash a remote control into the lens while you watch the monitor output. If you can see the flash, the sensor can see IR.

If you're looking at medical applications, that's a whole other ballgame. I work in R&D designing cardio and neuro stimulation leads and adapters, and the amount of testing and red tape we need to present a case to the FDA that our device is 1) safe, 2) effective and 3) better than the currently available stuff is quite expensive. I've got a lead design sitting in a tester that's over 270 million flex cycles, and it has to reach 400 million before it is acceptable. At 15 Hz, that's over a year of continuous testing. The tester, a Bose Enduratec (and you thought they just built speakers!), was not cheap, and our time to administer the test is not cheap either.

If you want a case study for why the medical products industry is and should be paranoid about testing, and why we charge so much (because that testing is expensive) Google the Therac-25. A software fault caused at least 6 known extreme overdoses of radiation therapy. A thermal imaging camera isn't in the same class - it does not emit stuff at the patient - but the idea is the same, in that if it harms or fails to detect harm to a patient the manufacturer may be held liable.

Have fun!
-cajun
HiroProtagonist
May 16th 2007
cajunfj40 sez: 'If you just want to see 'hot spots' and don't need the 'false color' image, you can take the IR filter off of a B&W video camera and 'see' IR. Google IR and B&W camera and you'll find oodles of links.'

This won't help the original poster if he's truly looking for IR thermal. Thermal imaging uses long wave IR. Normal camera sensors are sensitive to short wave IR, but this won't show thermal 'hot spots'.

c0redump: I saw an article on the web a while ago about a guy who made his own thermal imaging camera for under $100. He used the polymer sensors from PIR detectors - it was *very* low-res though. Unfortunately I couldn't find the article with a quick google, but it may still be out there.
machinemaker
May 17th 2007
Not sure if this helps, but I used to use a thermal imaging camera for troubleshooting industrial wiring. It was a camera made by Fluke, the same folks that make multimeters. It worked great for finding loose connections, bad breakers, and failing components. When you took images of people you could see quite a bit of difference in surface temperatures, and it was a couple grand. However, Fluke products are a bit pricey, and it had software for comparing images etc. I have to believe that one could build it cheaper.
Mysterio101
Oct 13th 2008
You can make a simple thermal imaging camera for $50 or maybe even less. It doesn't have an LCD though; it's kinda like a 'mechanical TV'. I'll tell you more later.

Bye all!
c0redump
Nov 30th 2008
@Mysterio101: yes, please do tell us more about this technique!
alankilian
Nov 30th 2008
One of the guys in the Twin Cities Robotics Group (tcrobots.org) works on the Fluke IR cameras. They are down to $4,500 now according to this page:
http://www.transcat.com/catalog/productdetail.aspx?itemnum=98589TE

Sometimes he brings a test camera to a meeting, and it's great fun drawing words on the wall with your finger and seeing them with the camera.
Huisjen
Dec 8th 2008
I'm getting into the home energy auditing business. I just bought a Flir/Extech i5 camera for the job. It's about $3,000. It has an 80x80 pixel array, which isn't super high resolution, but is good enough to see the swath of cold left by air leaking around doors, windows, and sill plates.
wbeaty
Dec 15th 2008 edited
The lens would be the expensive part!

Hundred-pixel thermal camera for $10:
http://web.archive.org/web/20070203225204/http://users.bestweb.net/~hobbs/footprints/fpspie11.pdf

Phil Hobbs' project uses ferroelectric/pyroelectric PVDF (polyvinylidene fluoride) film, the same sensor used in IR motion burglar alarms and security floodlights. To avoid buying 1000-gigohm resistors, and to improve the diode-muxed switching, he uses light-biased red LEDs as the diodes.

Note that pyroelectric plastic film only responds to temperature *changes*. Either it can only see moving sources (a thermal 'frog's eye' effect), or the camera needs a rotating chopper wheel.

I don't think he mentions the lens. Pinhole? Polyethylene Fresnel lens? Telescope mirror?
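
To see why the chopper matters, here's a toy Python simulation - no hardware assumed, all numbers made up - showing how chopping turns a static scene into an AC signal a pyro sensor can actually report, which lock-in demodulation then converts back into per-pixel temperatures:

    # Pyro film responds to temperature *changes*, so chop the scene
    # and demodulate at the chopper frequency to recover a static image.
    import numpy as np

    scene = np.array([20.0, 25.0, 30.0, 22.0])  # pixel temps (C)
    ambient = 20.0
    fs, f_chop = 1000.0, 15.0                   # sample rate, chopper freq (Hz)
    t = np.arange(0.0, 1.0, 1.0 / fs)

    # Chopper alternately shows the scene and an ambient-temperature blade.
    chop = 0.5 * (1 + np.sign(np.sin(2 * np.pi * f_chop * t)))  # 0/1 wave
    seen = ambient + np.outer(scene - ambient, chop)

    # Sensor output ~ dT/dt (ignoring the film's own time constant).
    signal = np.gradient(seen, 1.0 / fs, axis=1)

    # Differentiation shifts phase by 90 degrees, so demodulate with cosine.
    ref = np.cos(2 * np.pi * f_chop * t)
    recovered = (signal * ref).mean(axis=1)
    print(recovered / recovered.max())  # relative image, hottest pixel = 1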
cashsale
Dec 17th 2008
Seems like something useful could be created with a single pixel of IR-sensitive material and a mechanical mirror/voice-coil scanning arrangement. Sensitivity would be horrible, but a tripod might allow one to produce a fairly high-resolution image given enough time - seconds? minutes? There would be some averaging as a result of the exposure time.

Yet another idea not suitable for the original poster, but a thought nonetheless.
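
For what the software side of that scanner might look like, here's a short Python sketch; point_mirror() and read_pixel() are hypothetical stand-ins for whatever mirror driver and detector you'd actually use:

    # Single-pixel raster imaging: aim at each (row, col), average a few
    # readings to fight the poor sensitivity, build the image slowly.
    import numpy as np

    ROWS, COLS, SAMPLES = 32, 32, 8  # more samples = less noise, more time

    def point_mirror(row, col):
        """Hypothetical: command the mirror/voice-coil to aim at (row, col)."""
        pass

    def read_pixel():
        """Hypothetical: one reading from the IR element (noise stand-in here)."""
        return 20.0 + np.random.randn() * 0.5

    image = np.zeros((ROWS, COLS))
    for r in range(ROWS):
        for c in range(COLS):
            point_mirror(r, c)
            image[r, c] = np.mean([read_pixel() for _ in range(SAMPLES)])

    print(image.shape, image.mean())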
MarissaLamb
Feb 2nd 2009 edited
The price of the technology has come down a ton. We are an HVAC company and shopped several cameras in the FLIR line. Overall we liked the FLIR B200 and FLIR B250 the best, but we also considered the FLIR B40, B50, and B60.

Ultimately we liked the ability to add additional lenses with the B250, even though the B60 had a higher pixel count.
jschuch
Feb 6th 2009
If I were going to tackle this, I'd start with one of the inexpensive IR Thermometers that Harbor Freight is selling. Figure out how to get a signal out of the thing, maybe put a pinhole over the sensor to tighten up the dot size, and then build some sort of mechanical scanner. The scan speed would depend on the responsiveness of the sensor.

Would be a fun project.

http://www.harborfreight.com/cpi/ctaf/displayitem.taf?Itemnumber=96451

John
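
The scan speed point is easy to put numbers on: frame time is just pixel count times the sensor's settling time. A quick Python check, with the half-second response time purely a guess for a cheap IR thermometer module:

    # Rough frame-time estimate for a mechanically scanned IR thermometer.
    rows, cols = 64, 64   # target resolution
    dwell_s = 0.5         # guessed settle/response time per reading
    frame_s = rows * cols * dwell_s
    print(f"{frame_s:.0f} s per frame (~{frame_s / 60:.0f} minutes)")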
chaz_p
Aug 1st 2009
What do people think of these? Are they the right sort of sensor, or am I going down the wrong route?

http://www.rapidonline.com/Electronic-Components/Sensors/Thermal-Sensors/TPA81-Thermopile-array-sensor/77927/kw/78-0792?source=googleps&utm_source=googleps
I really want a thermal imager to hunt with but don't want to pay thousands.
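
For what it's worth, the TPA81 is an 8x1 thermopile array read over plain I2C, so the readout side is easy. A minimal Python sketch for something like a Raspberry Pi follows; the address and register numbers are from memory of the Devantech docs, so treat them as assumptions to verify against the datasheet:

    # Minimal TPA81 (8x1 thermopile array) readout over I2C (pip install smbus2).
    from smbus2 import SMBus

    TPA81_ADDR = 0x68     # 7-bit form of the default 0xD0 address (verify)
    REG_AMBIENT = 1       # ambient temperature register, degrees C (verify)
    REG_PIXEL_BASE = 2    # registers 2..9 hold the 8 pixel temps (verify)

    with SMBus(1) as bus:  # I2C bus 1 on a Raspberry Pi
        ambient = bus.read_byte_data(TPA81_ADDR, REG_AMBIENT)
        pixels = [bus.read_byte_data(TPA81_ADDR, REG_PIXEL_BASE + i)
                  for i in range(8)]

    print(f"ambient: {ambient} C, pixels: {pixels}")
    # Sweep the module on a servo and stack these 8-pixel columns to get
    # a crude 2-D thermal image - the usual trick with this part.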
ttgrthomas
Aug 4th 2009
Single-rotor, radio- or telemetry-controlled helicopters are commercially available. These Search & Rescue and security helicopters usually come equipped with a very good digital camera. Sadly, though, far fewer UAV helicopters currently have IR thermal imaging cameras onboard.

[Quote from a recent Makezine commenter:] 'The trick is that since they're seen as high-end devices, any sensors with flaws are discarded, instead of built into cheap cameras. If they were a consumer-grade item, more defects would be acceptable and manufacturing would get cheaper. It's a chicken-and-egg problem. Anyway, prices are coming down, you can get a very nice thermal camera for $5,000 or so now. Do some research, if you come up with a DIY method I'd be interested!' [End quote]

So, on that note, why not design, build, and offer for sale very reliable, foolproof, radio- or telemetry-operated single-rotor UAV camera helicopters for Search & Rescue and security purposes? I cannot yet find any extended-range versions with fuel cell and compressed, metal-hydride, or liquid hydrogen options for sale to us technologically hampered civilians.

Once done, you'll sell many THOUSANDS worldwide. Afterwards, maybe you'll have enough of a customer base that you can contact all your clients and offer a 'consumer'-quality version of the IR thermal imaging camera for Search & Rescue and security use.

Remember, limited range (paltry flight duration) is currently the main limiting factor in common civilian UAV camera drone technology today.

It doesn't have to be that way, though.

Me, I want one UAV at elevation immediately above me, with a telemetry-operated telephoto lens onboard, pointed at a second extended-range UAV helicopter up to 10 miles downrange. This way, a constant line of sight is maintained between the UAV hovering almost immediately above me and the one that's out there, up to a 10-mile radius from my vehicle.

With this strategy, I'm guessing that I'm technically 'legal' under current FAA rules on operating unpermitted unmanned aerial vehicles. Right? The rule says that a line of sight must be maintained at all times.

Any questions? Find a very rare copy of 'Electric Car' magazine from 1995. Alan Cocconi of AC Propulsion in San Dimas, California was featured in this low-circulation 'market test' magazine about electric vehicles. Contact www.acpropulsion.com and read all about the 'So Long' UAV and Alan's experience with UAV camera drones in the 1980s and 1990s.

While you're at it, pick up a 248-horsepower (maximum) electric vehicle motor with the necessary PWM/MOSFET power control electronics. Price? A mere $27,000. Best of the best, for sure!

This powerful electric vehicle motor is currently installed in the www.acpropulsion.com T-Zero sports car and the new Tesla electric sports car at www.teslamotors.com

Bring me onboard in your company if I can spill any more profitable beans.

Employment is very scarce for those without a proper 4-year degree right now.

480.528.0632, Mr. Thomas

- Sent using Google Toolbar"

Web Start-Ups Making Deals for Users’ Private Data - NYTimes.com

Web Start-Ups Making Deals for Users’ Private Data - NYTimes.com: "Web Start-Ups Offer Bargains for Users’ Data
By STEPHANIE CLIFFORD
Published: May 30, 2010

As concern increases in Washington about the amount of private data online, and as big sites like Facebook draw criticism that they collect consumers’ information in a stealthy manner, many Web start-ups are pursuing a more reciprocal approach — saying, in essence: give us your data and get something in return.
[Photo: Peter DaSilva for The New York Times]
Aaron Patzer, Mint’s founder, in 2008. He said his idea was simple: “We would data-mine your own data in order to help you.”
Related: Sam's Club Personalizes Discounts for Buyers (May 31, 2010) | Times Topic: Privacy

[Photo: WeShop.com, still in development, has built a system that allows people to spread information about their shopping habits.]

The budgeting Web site Mint.com, for example, displays discount offers from cable companies or banks to users who reveal their personal financial data, including bank and credit card information. The clothing retailer Bluefly could send offers for sunglasses to consumers who disclose that they just bought a swimsuit. And location-based services like Foursquare and Gowalla ask users to volunteer their location in return for rewards like discounts on Pepsi drinks or Starbucks coffee.

These early efforts are predicated on a shift in the relationship between consumer and company. Influenced by consumers’ willingness to trade data online, the sites are pushing to see how much information people will turn over.

- Sent using Google Toolbar"

Gulf Oil Spill: Massive Underwater Plumes Spell Disaster, Scientists Say

NEW ORLEANS — Independent scientists and government officials say there's a disaster we can't see in the Gulf of Mexico's mysterious depths, the ruin of a world inhabited by enormous sperm whales and tiny, invisible plankton.

Researchers have said they have found at least two massive underwater plumes of what appears to be oil, each hundreds of feet deep and stretching for miles. Yet the chief executive of BP PLC – which has for weeks downplayed everything from the amount of oil spewing into the Gulf to the environmental impact – said there is "no evidence" that huge amounts of oil are suspended undersea.

BP CEO Tony Hayward said the oil naturally gravitates to the surface – and any oil below was just making its way up. However, researchers say the disaster in waters where light doesn't shine through could ripple across the food chain.

"Every fish and invertebrate contacting the oil is probably dying. I have no doubt about that," said Prosanta Chakrabarty, a Louisiana State University fish biologist.

On the surface, a 24-hour camera fixed on the spewing, blown-out well and the images of dead, oil-soaked birds have been evidence of the calamity. At least 20 million gallons of oil and possibly 43 million gallons have spilled since the Deepwater Horizon drilling rig exploded and sank in April.

That has far eclipsed the 11 million gallons released during the Exxon Valdez spill off Alaska's coast in 1989. But there is no camera to capture what happens in the rest of the vast Gulf, which sprawls across 600,000 square miles and reaches more than 14,000 feet at its deepest point.

Every night, the denizens of the deep make forays to shallower depths to eat – and be eaten by – other fish, according to marine scientists who describe it as the largest migration on earth.

In turn, several species closest to the surface – including red snapper, shrimp and menhaden – help drive the Gulf Coast fishing industry. Others such as marlin, cobia and yellowfin tuna sit atop the food chain and are chased by the Gulf's charter fishing fleet.

Many of those species are now in their annual spawning seasons. Eggs exposed to oil would quickly perish. Those that survived to hatch could starve if the plankton at the base of the food chain suffer. Larger fish are more resilient, but not immune to the toxic effects of oil.

The Gulf's largest spill was in 1979, when the Ixtoc I platform off Mexico's Yucatan peninsula blew up and released 140 million gallons of oil. But that was in relatively shallow waters – about 160 feet deep – and much of the oil stayed on the surface where it broke down and became less toxic by the time it reached the Texas coast.

But last week, a team from the University of South Florida reported a plume was headed toward the continental shelf off the Alabama coastline, waters thick with fish and other marine life.

The researchers said oil in the plumes had dissolved into the water, possibly a result of chemical dispersants used to break up the spill. That makes it more dangerous to fish larvae and creatures that are filter feeders.

Responding to Hayward's assertion, one researcher noted that scientists from several different universities have come to similar conclusions about the plumes after doing separate testing.

No major fish kills have been reported, but federal officials said the impacts could take years to unfold.

"This is just a giant experiment going on and we're trying to understand scientifically what this means," said Roger Helm, a senior official with the U.S. Fish and Wildlife Service.

In 2009, LSU's Chakrabarty discovered two new species of bottom-dwelling pancake batfish about 30 miles off the Louisiana coastline – right in line with the pathway of the spill caused when the Deepwater Horizon burned and sank in April.

By the time an article in the Journal of Fish Biology detailing the discovery appears in the August edition, Chakrabarty said, the two species – which pull themselves along the seafloor with feet-like fins – could be gone or in serious decline.

"There are species out there that haven't been described, and they're going to disappear," he said.

Recent discoveries of endangered sea turtles soaked in oil and 22 dolphins found dead in the spill zone only hint at the scope of a potential calamity that could last years and unravel the Gulf's food web.

Concerns about damage to the fishery are already turning away potential customers for charter boat captains such as Troy Wetzel of Venice. To get to waters unaffected by the spill, Wetzel said he would have to take his boat 100 miles or more into the Gulf – jacking up his fuel costs to the point where only the wealthiest clients could afford to go fishing.

Significant amounts of crude oil seep naturally from thousands of small rifts in the Gulf's floor – as much as two Exxon Valdez spills every year, according to a 2000 report from government and academic researchers. Microbes that live in the water break down the oil.

The number of microbes that grow in response to the more concentrated BP spill could tip that system out of balance, LSU oceanographer Mark Benfield said.

Too many microbes in the sea could suck oxygen from the water, creating an uninhabitable hypoxic area, or "dead zone."

Preliminary evidence of increased hypoxia in the Gulf was seen during an early May cruise aboard the R/V Pelican, carrying researchers from the University of Georgia, the University of Mississippi and the University of Southern Mississippi.

An estimated 910,000 gallons of dispersants – enough to fill more than 100 tanker trucks – are contributing a new toxin to the mix. Containing petroleum distillates and propylene glycol, the dispersants' effects on marine life are still unknown.

What is known is that by breaking down oil into smaller droplets, dispersants reduce the oil's buoyancy, slowing or stalling the crude's rise to the surface and making it harder to track the spill.

Dispersing the oil lower into the water column protects beaches, but also keeps it in cooler waters where oil does not break down as fast. That could prolong the oil's potential to poison fish, said Larry McKinney, director of the Harte Research Institute at Texas A&M University-Corpus Christi.

"There's a school of thought that says we've made it worse because of the dispersants," he said.