Friday, December 28, 2012

Your Daily digest for Tech Geek`s Tools, Tips, Tricks and Tutorials

Tech Geek`s Tools, Tips, Tricks and Tutorials
Michigan Makes It Illegal To Ask For Employees' Facebook Logins
Dec 28th 2012, 23:30

An anonymous reader writes "Michigan joins Maryland as a state where employers may not ask employees or job applicants to divulge login information for Facebook and other social media sites. From the article: 'Under the law, employers cannot discipline employees or decline to hire job applicants because they do not give them access information, including user names, passwords, login information, or "other security information that protects access to a personal internet account," according to the bill. Universities and schools cannot discipline or fail to admit students if they do not give similar information.' There is one exception: 'However, accounts owned by a company or educational institution, such as e-mail, can be requested.'"


Read more of this story at Slashdot.



How ISPs Collude To Offer Poor Service
Dec 28th 2012, 23:06

alexander_686 writes "Bloomberg is running a series of articles from Susan Crawford about the stagnation of internet access in the U.S., and why consumers in America pay more for slower service. Quoting: 'The two kinds of Internet-access carriers, wired and wireless, have found they can operate without competing with each other. The cable industry and AT&T-Verizon have divided up the world much as Comcast and Time Warner did; only instead of, "You take Philadelphia, I'll take Minneapolis," it's, "You take wired, I'll take wireless." At the end of 2011, the two industries even agreed to market each other’s services.' I am a free market type of guy. I do recognize the abuse that can come from natural monopolies that utilities tend to have, but I have never considered this type of collusion before. To fix the situation, Crawford recommends that the U.S. 'move to a utility model, based on the assumption that all Americans require fiber-optic Internet access at reasonable prices.'"


Read more of this story at Slashdot.



Facebook Paid 0.3% Taxes On $1.34 Billion Profits
Dec 28th 2012, 22:14

theodp writes "Facebook is unlikely to make many new (non-investor) friends with reports that it paid Irish taxes of about $4.64 million on its entire non-U.S. profits of $1.344 billion for 2011. 'Facebook operates a second subsidiary that is incorporated in Ireland but controlled in the Cayman Islands,' Kenneth Thomas explains. 'This subsidiary owns Facebook Ireland, but the setup allows the two companies to be considered as one for U.S. tax purposes, but separate for Irish tax purposes. The Caymans-operated subsidiary owns the rights to use Facebook's intellectual property outside the U.S., for which Facebook Ireland pays hefty royalties to use. This lets Facebook Ireland transfer the profits from low-tax Ireland to no-tax Cayman Islands.' In 2008, Facebook COO Sheryl Sandberg cited 'local world-class talent' as the motivation behind Facebook's choice of tax-haven Dublin for its international HQ. Similar tax moves by Google, Microsoft, and others who have sought the luck-of-the-Double-Irish present quite a dilemma for tax revenue-seeking governments. Invoking Supreme Court Justice Potter Stewart's famous common sense definition of ethics ('Ethics is knowing the difference between what you have a right to do and what is right to do') is unlikely to sway corporations whose top execs send the message that tax avoidance is the right thing to do and something to be proud of."
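A quick sanity check on the headline figure (my arithmetic, not from the submission): $4.64 million ÷ $1,344 million ≈ 0.0035, or roughly 0.35%, which the headline rounds to 0.3%.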


Read more of this story at Slashdot.



Instagram User Drop Claims Overblown
Dec 28th 2012, 21:23

Nerval's Lobster writes "When AppData first posted a graph showing a 25 percent drop in Instagram's daily active users, it sparked a flurry of discussion online—much of it focused on the recent controversy over the photo-sharing service's Terms of Use. The New York Post, for example, blamed the dip on a 'revolt' among Instagram users incensed over changes in the Terms of Use, including new legalese that some interpreted as blanket permission for the service to start selling user photos to advertisers. But a new statement from AppData, which tracks app traffic, suggests there's another cause behind the dip in daily active users: the season. 'The decline in Facebook-connected daily active users began closer to Christmas, not immediately after the proposed policy changes,' read a statement the firm sent to The Wall Street Journal. 'The drop between Dec. 24 and 25 seems likely to be related to the holiday, during which time people are traveling and otherwise have different routines than usual.'" It's also possible (likely, even) that there's no loss of users at all. AppData only checks a subset of Instagram users, and the photo-sharing site itself has said the data represented there is not accurate. Another article points out that several other Facebook-related services showed significant drops, according to AppData, which could suggest a problem with the entire platform or with the data-gathering methods.


Read more of this story at Slashdot.



A Peek Into the Business Side of Online Publishing (Video)
Dec 28th 2012, 20:35

Mark Westlake is the Chief Revenue Officer for TechMediaNetwork. Slashdot has often taken a mediawatch role, especially when it comes to technology coverage -- which is what TechMediaNetwork does for a living. As Chief Revenue Officer, Mark is in charge of making sure enough money comes in to pay writers and editors, pay for bandwidth and servers, and hopefully have enough revenue over and above expenses to show a profit. We've interviewed editors and writers, and plenty of writers' work gets linked from Slashdot, but we pay little or no (mostly no) attention to the business side of the publishing business. Like it or not, if we are going to have online news someone has to sell the ads and make decisions about whether to set up a paywall or not. That's Mark's job. Like him or not, he does a job somebody has to do, and has been doing it for 30 years. He knows he's talking to a potentially hostile audience here, but he accepts that. As he says, near the end of the video, "...you can't please everybody, right?"


Read more of this story at Slashdot.



What Turned VR Pioneer Jaron Lanier Against the Web
Dec 28th 2012, 19:46

i_want_you_to_throw_ writes "Details of Jaron Lanier's crusade against Web 2.0 continue in an article at Smithsonian Magazine. The article expands upon Lanier's criticism of Web 2.0. It's an interesting read, with Lanier suggesting we are outsourcing ourselves into insignificant advertising-fodder and making an audacious connection between techno-utopianism, the rise of the machines and the Great Recession. From the article: 'As far back as the turn of the century, he singled out one standout aspect of the new web culture—the acceptance, the welcoming of anonymous commenters on websites—as a danger to political discourse and the polity itself. At the time, this objection seemed a bit extreme. But he saw anonymity as a poison seed. The way it didn’t hide, but, in fact, brandished the ugliness of human nature beneath the anonymous screen-name masks. An enabling and foreshadowing of mob rule, not a growth of democracy, but an accretion of tribalism. ... 'This is the thing that continues to scare me. You see in history the capacity of people to congeal—like social lasers of cruelty. That capacity is constant. ... We have economic fear combined with everybody joined together on these instant twitchy social networks which are designed to create mass action. What does it sound like to you? It sounds to me like the prequel to potential social catastrophe. I’d rather take the risk of being wrong than not be talking about that.'"


Read more of this story at Slashdot.



Senate Renews Warrantless Eavesdropping Act
Dec 28th 2012, 18:55

New submitter electron sponge writes "On Friday morning, the Senate renewed the FISA Amendments Act (PDF), which allows for warrantless electronic eavesdropping, for an additional five years. The act, which was originally passed by Congress in 2008, allows law enforcement agencies to access private communications as long as one participant in the communications could reasonably be believed to be outside the United States. This law has been the subject of a federal lawsuit, and was argued before the Supreme Court recently. 'The legislation does not require the government to identify the target or facility to be monitored. It can begin surveillance a week before making the request, and the surveillance can continue during the appeals process if, in a rare case, the secret FISA court rejects the surveillance application. The court’s rulings are not public.'" The EFF points out that the Senate was finally forced to debate the bill, but the proposed amendments that would have improved it were rejected.


Read more of this story at Slashdot.



Pirate Radio Station In Florida Jams Automotive Electronics
Dec 28th 2012, 18:05

New submitter titanium93 writes "For months, dozens of people could not use their keyless entry systems to unlock or start their cars when parked in the vicinity of the eight-story Regents bank building in Hollywood, FL. Once the cars were towed to the dealership for repair, the problem went away. The problem was finally resolved when police found equipment on the bank's roof that was broadcasting a bootleg radio station. A detective and an FCC agent found the equipment hidden underneath an air conditioning chiller. The man who set up the station has not been found, but he faces felony charges and fines of at least $10,000 if he is caught. The radio station was broadcasting Caribbean music around the clock on 104.7 FM."


Read more of this story at Slashdot.



Ask Slashdot: Linux-Friendly Motherboard Manufacturers?
Dec 28th 2012, 17:14

dotancohen writes "I am tasked with building a few Linux machines for a small office. However, many of the currently available motherboards seem to be Linux-hostile. For instance, in addition to the whole UEFI issue, my last install was a three-day affair due to the motherboard reporting a Linux-supported Ethernet device (the common RTL8168) while it was actually using a GbE Ethernet device that does not work with the legacy drivers and didn't even work with a test Windows 7 install until the driver disk was installed. There are no current hardware compatibility lists for Debian or Ubuntu, and I've received from Asus and Gigabyte the expected reply: no official Linux support, install Windows for best experience. I even turned to the two large local computer vendors, asking if they could provide Linux-compatible machines ready to go, but neither of them would be of any help. What globally-available motherboards or motherboard manufacturers can you recommend today?"


Read more of this story at Slashdot.



McAfee Labs Predicts Decline of Anonymous
Dec 28th 2012, 16:26

Every year, McAfee Labs produces a list of predictions relating to computer security for the next 12 months. Last year (PDF) they said Anonymous would have to reinvent itself, and that there would be an overall increase in online hacktivism. This year's report (PDF) is not as optimistic for the hacking collective. "Too many uncoordinated and unclear operations have been detrimental to its reputation. Added to this, the disinformation, false claims, and pure hacking actions will lead to the movement’s being less politically visible than in the past. Because Anonymous’ level of technical sophistication has stagnated and its tactics are better understood by its potential victims, the group’s level of success will decline." That's not to say they think hacktivism itself is on the decline, though: "Meanwhile, patriot groups self-organized into cyberarmies and spreading their extremist views will flourish. Up to now their efforts have had little impact (generally defacement of websites or DDoS for a very short period), but their actions will improve in sophistication and aggressiveness." The report also predicts that malware kits will lead to an "explosion in malware" for OS X and mobile, but that Windows 8 will be the next big target.


Read more of this story at Slashdot.



Ouya Dev Consoles Ship, SDK Released
Dec 28th 2012, 15:33

An anonymous reader writes "Earlier this year, the Android-based Ouya game console project raised over nine times as much funding as they initially asked for in their Kickstarter campaign. Now, Ouya developer consoles are starting to ship, and folks on the Ouya team released a video showing what the developers should expect. As explained in the video, the console currently being shipped is by no means the final hardware, but promises to give developers everything they need to start developing apps and games for Ouya. The only surprise is that they decided to add a micro-USB port to the hardware, making it easy to hook up to a PC. The Ouya team has also released an SDK for the device (which they call the ODK — Ouya Development Kit), and have provided most of the source under the Apache 2.0 license. They wrote, 'We think we’ve got a great team of developers here at OUYA, but there’s strength in numbers and a wealth of passionate, talented people out there. We want you, the developers of the world, to work alongside us to continually improve our platform. It’s our hope that releasing a more open ODK will help foster such innovation.'"


Read more of this story at Slashdot.



China Tightens Internet Restrictions
Dec 28th 2012, 14:45

The NY Times reports China has once again stepped up its efforts to control the internet, passing a new set of rules by which internet users and ISPs must abide. In addition to requiring that users provide their real names to internet providers, the government says those providers are now more responsible for deleting or blocking posts that aren't agreeable to the Chinese authorities. Quoting: "The new regulations, issued by the Standing Committee of the National People’s Congress, allow Internet users to continue to adopt pseudonyms for their online postings, but only if they first provide their real names to service providers, a measure that could chill some of the vibrant discourse on the country’s Twitter-like microblogs. The authorities periodically detain and even jail Internet users for politically sensitive comments, such as calls for a multiparty democracy or allegations of impropriety by local officials. In recent weeks, Internet users in China have exposed a series of sexual and financial scandals that have led to the resignations or dismissals of at least 10 local officials. International news media have also published a series of reports in recent months on the accumulation of wealth by the family members of China’s leaders, and some Web sites carrying such reports ... have been assiduously blocked, while Internet comments about them have been swiftly deleted. The regulations issued Friday build on a series of similar administrative guidelines and municipal rules issued over the past year. China’s mostly private Internet service providers have been slow to comply with them, fearing the reactions of their customers. The Standing Committee’s decision has much greater legal force, and puts far more pressure on Chinese Internet providers to comply more quickly and more comprehensively, Internet specialists said."


Read more of this story at Slashdot.



NASA's Ion Thruster Sets Continuous Operation Record
Dec 28th 2012, 13:54

cylonlover writes "NASA's Evolutionary Xenon Thruster (NEXT) ion engine has set a new world record by clocking 43,000 hours of continuous operation at NASA's Glenn Research Center's Electric Propulsion Laboratory. The seven-kilowatt thruster is intended to propel future NASA deep space probes on missions where chemical rockets aren't a practical option. The NEXT is one of NASA's latest generation of engines. With a power output of seven kilowatts, it's over twice as powerful as the ones used aboard the unmanned Dawn space probe, yet it is simpler in design, lighter and more efficient, and is also designed for very high endurance. Its current record of 43,000 hours is the equivalent of nearly five years of continuous operation while consuming only 770 kg (1697.5 lbs) of xenon propellant. The NEXT engine (PDF) would provide 30 million newton-seconds of total impulse to a spacecraft. What this means in simple terms is that the NEXT engine can make a spacecraft go (eventually) very far and very fast."
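A rough back-of-the-envelope check (my arithmetic, not from the submission): dividing the quoted total impulse by the propellant weight gives an effective specific impulse of about 30,000,000 N·s ÷ (770 kg × 9.81 m/s²) ≈ 4,000 s, roughly ten times that of a typical chemical rocket, which is why so little xenon can take a spacecraft so far.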


Read more of this story at Slashdot.



Backup Strategy: “My Hard Drive Crashed…” (And What I Learned From It)
Dec 28th 2012, 12:04


  

The most valuable part of a computer is also its most fragile: Data are the wealth of a digital lifestyle, a currency of which many notes are irreplaceable. At least, that's how I felt staring at a "Confirm you want to wipe your hard disk" message, my finger poised over the mouse.

An emergency is a bad time to plan for one. It's the feeling one might get jumping from a plane before checking one's parachute. That's one experience I'd rather avoid, but it happened. Not the skydiving part. My OS was dying, and I wasn't prepared.

People who make websites face a triple threat: Live websites need backups; test environments need backups, especially when they double as backups for live websites. Subversion and Git provide safety nets in case of data loss. But there are also support files: Photoshop files, fonts, reusable jQuery snippets — not to mention music collections, an essential part of many creative processes.

I kept regular backups of many files using Apple's Time Machine. But "many" is not "all," and just then my Mac was too erratic for me to tell which fraction I had missed. After copying vital files to a handful of spare hard drives, I took a breath, formatted the disk and reinstalled the OS.

An hour later I was dismayed to see how many files I'd failed to back up. Photoshop files, local test websites, PDFs and most text files were safe. But passwords saved in the OS, cached emails, FTP bookmarks, application preferences and serial numbers, browser history, plugins, color swatches, copies of old browsers for testing… Gone.

Anything digital is susceptible to loss. For a recovering digital packrat like myself, who lives (and now dies) by the Web, data loss is a disaster akin to a tornado, which may also destroy backups kept in the same office as the original files. Fire, theft, spilled coffee, overwritten files, disgruntled coworkers, zombie attacks — I played out nightmare scenarios in my head. Then I began to research better ways to safeguard my digital life.

Two Safeguard Services

Offsite backup systems help with disaster recovery by storing versions of files in secure facilities. Many services exist, but I compared the $50-per-year Backblaze service to $50-per-year CrashPlan+ Unlimited package. I'd read reviews of both before and often saw them compared against each other. Which was better? I wanted to find out.

My test environment was a 2011 MacBook Air running OS X 10.8.2. I tested backups from four different locations over a period of three weeks. Important note: While the online backup services I tested should also work on Windows machines, I didn't have access to a computer with Windows OS on which to test. Anyone with backup experience on Windows or with services other than Backblaze and CrashPlan is welcome to share their experience in the comments.

Screenshot of DaisyDisk's colorful diagram of my hard drive

After two weeks of backups, I discovered a way to prioritize data. Using DaisyDisk, I determined that the largest folder — my music — was also the most replaceable. I instructed Backblaze and CrashPlan to avoid music for seven days, forcing them to focus on documents and Web files. Once I saw that my vital websites and support files were in both services, I felt safe enough to let them archive my music.

To expedite the backups and placate my neighbors, I also backed up overnight for two weeks, using Caffeine to keep my Mac awake and calibrating my battery after 10 days of continuous nightly charges.

Starting Backblaze

Backblaze customers may install an application for OS X (10.5 and up) or Windows (XP, Vista and 7). On OS X, Backblaze is accessible as a System Preferences pane. Whether Mac or PC, while a user's computer is on and connected to the Internet, Backblaze uploads the contents of the hard drive to a custom facility in San Francisco. The app was tiny, occupying 829 KB on my hard drive and using 10 MB of RAM.

After a quick hard-drive scan, Backblaze determined I had 92.7 GB of files ready to be backed up, mostly in my User folder. As advertised, it proceeded to copy my files any time my Mac had access to Wi-Fi.

The nature of Backblaze is set and forget, which I often did. Aside from its menu bar icon and an occasional network lag, the application did not draw my attention. I could view my archive through Backblaze's website the day after installation.

Starting CrashPlan

Like Backblaze, CrashPlan uploads selected sections of a hard drive to the service's facilities in seven cities around the world. It also works in the background, avoiding notice unless summoned. To that end, CrashPlan for OS X installs two applications: one program accessible from its menu item, and a second full program that provides statistics, account information and options to store data on devices other than CrashPlan's servers. The menu bar shows how the current backup is proceeding, the current file being uploaded and pausing options.

Upon first launching, CrashPlan informed me that 101.6 GB was ready to be uploaded. I wasn't able to determine why it saw more files than Backblaze. Backblaze claims it does not upload podcasts, but that didn't account for 10 GB.

Comparative Features

From a casual glance, Backblaze and CrashPlan are similar enough to spawn their own patent war. Both systems share the same goal: to provide peace of mind by saving customers' data on remote servers. Both systems work the same way, copying files via the Internet while the user works. But they differ in more than options, prices and controls. The more I looked, the more their differences became apparent.

Pause

Both Backblaze and CrashPlan feature pause buttons, the need for which I discovered after three days.

Editing a WordPress theme on a live website, I stepped away for a cup of liquid enthusiasm while my FTP program connected to the server. When I came back, it was still connecting. Even normally fast websites took up to 40 seconds to load. This was the kind of slow usually reserved for Murphy's Law. (True to form, it was an emergency PHP error on deadline.) The culprit was running both backup services.

Backblaze's drop-down menu controls

Users may set Backblaze to run once per day or manually, but the application defaults to "continuous." Continuous backups may be paused until told to resume. It also resumes upon log-in — say, after a reboot.

Whereas Backblaze stays paused until told otherwise, CrashPlan users may pause with a timer for 1, 2, 4, 8, 12 or 24 hours. CrashPlan's "resume later" mentality was handy when, focused on my work, I would forget to reenable it. That cut both ways. More than once, I noticed a sudden Internet slowdown as CrashPlan reactivated. Backblaze gave me no such surprises.

CrashPlan's drop-down menu controls

Both services stopped running when I put my Mac to sleep, switched users or otherwise logged out of my user account. However, both services can back up the entire contents of a computer's hard drive — including user accounts not set to run the services. If someone else uses your Mac and you need access to their account, Backblaze and CrashPlan are happy to oblige.

Bandwidth

Although both applications stay out of the way, the fact that I was moving about 200 GB across the Internet did not go unnoticed. Neither backup service caused my Mac to slow down, but both I and people on the local network saw Internet speeds take a hit any time I had one or both running.

Slowdown was especially apparent as I was editing live websites. Websites at Rackspace, Dreamhost and Pair became tedious with either service running. Even the act of refreshing directories with Transmit disrupted my train of thought. But were the problems consistent? And which service caused more lag?

To measure their impact on Internet access, I uploaded the same files, totaling 1.5 MB, to a remote Web server via FTP from different locations and different times of day.

One result was obvious. Locations with more than 10 laptops, smartphones and iPads jostling for bandwidth saw varied results. Especially because each test took between 5 and 15 minutes, demand for local Wi-Fi changed as people came and went. But running either Backblaze or CrashPlan always slowed my FTP upload and download time, regardless of the crowd.

Infographic of which service reduced my FTP speeds

The graph above shows the time it took to upload files to a website while either CrashPlan or Backblaze was running. Seven times, uploads were faster while CrashPlan was running than when Backblaze was. Four times, the opposite was true. Three times were too close to call. One time, uploading to FTP while Backblaze was running was faster than uploading with Backblaze off, a testament to erratic bandwidth usage at coffee shops.

Anecdotally, I learned to stop both services while working on live websites (transferring WordPress from local to live servers) or while otherwise performing network-intensive tasks. The blue shows upload times with neither Backblaze nor CrashPlan running. Not only was it faster in 13 of 14 tests, but turning off both services doubled my FTP speed.

At least, it felt like a speed boost. By chance, during one test I also updated software on my iPad via Wi-Fi. I thought the updates had stalled until I disabled the backup services.

Recovery

Backup is half the story. Getting files back is the other. Hopefully, few people will be parted from their data. But if the worst happens, customers of both Backblaze and CrashPlan can recover files via the Web. Customers can also use CrashPlan's application and iPhone app to recover files — at least, on paper.

Backblaze's Straightforward Process

Backblaze's secure website lets users browse their files as they would their computer's file structure. After I selected random HTML, CSS, JPG and PNG files from different folders for recovery, an email notified me that my files were ready on the "My restores" page. The selected files downloaded as a single ZIP file. Customers may also request their files to be sent on a USB jump drive for $99 or on a hard disk for $189.

Recovering files on Backblaze's website

CrashPlan's Many Options

The full CrashPlan application allowed me to select backed-up files to download to a destination folder on my Mac. Although CrashPlan warned me that it was "unable to restore until we have synchronized with the destination," my files downloaded with no fuss. But I made the mistake of choosing my already-crowded desktop to receive a dozen recovered files. This differed from Backblaze, which downloaded one ZIP file that contained one folder.

Recovering files in CrashPlan's application

Users may also restore files from CrashPlan's Web application, although my experience indicated otherwise. While I had no trouble accessing my online account, five times over one week the website informed me it was "unable to log into server" to recover files. CrashPlan's tech support said recovery via the Web was a known issue but had no estimate of when the problem would be fixed.

I never had problems recovering files from CrashPlan's app for OS X. The iPhone app was a different story.

Backup to iPhone

CrashPlan's iOS app (v1.3.2) allows users to browse and download their saved files to their iPhones. As with the OS X application, I navigated a copy of my Mac's file structure by tapping the appropriate folders.

Tapping a single JavaScript, CSS, TXT or HTML document downloaded a copy of that file to my phone, after which I could read it as a text file in the CrashPlan app. The app let me copy text, email the file or open in other text-savvy apps, such as Evernote.

Rather than show source code, the app displayed HTML files as Web pages, including CSS styling, right down to clickable email links that opened my iPhone's Mail app. Thus, I couldn't tell, for example, whether a given HTML file had Google Analytics installed. PHP, JavaScript and CSS were legible as unformatted source code.

Recovering files in CrashPlan's iPhone app

For JPG and PNG images, the CrashPlan app offered to email, open in image-savvy apps or save to my photo roll. Files that it could not read — such as Pixelmator's PXM files — were displayed as an icon, along with the file size and a "Send by email" option.

The app had two significant limitations. First, I could recover only one file at a time. Granted, an iPhone is not the ideal destination to recover hundreds of files. But if you want to restore whole websites, be prepared for a lot of tapping.

Secondly, a week after I recovered a few files with the iPhone app, some of those files did not appear on my phone. Although I downloaded a WordPress plugin, for example, according to the app, my "Downloads" folder was empty.

Backblaze has no iOS app — yet. At the time of writing, the company has only hinted that it is developing an app.

Options Abound

The services covered here aren't the only two that sell peace of mind. Services such as Mozy and Carbonite follow the same approach: back up all of a customer's data to a remote, secure location. But design agencies and Web developers with dedicated computers might find advantages in selectively saving.

Mozy's product page

LayerVault

Aimed at creative users, LayerVault targets Adobe CS files, with an appropriately visual user interface. Its service includes full-sized previews of changes over time, collaborative change tracking and simple editing tools.

LayerVault's tour page

Dropbox

Dropbox keeps more than the files that people entrust to it; it keeps versions and deleted files for 30 days as well.

Dropbox's versions page

Evernote

This free note-taking software captures any type of information users throw at it. While Evernote won't back up files on one's hard drive, it is a good repository for HTML, CSS and JavaScript snippets and commonly used JPG and PNG files. Evernote notebooks can be shared, enabling a Web design team to collaborate on the same library.

Evernote's application

Backupify

While Google Drive (née Google Docs) resides in a secure facility, Google states that deleted files are gone forever. Backupify protects Google Drive and Gmail from users' accidental deletions.

Backupify screenshot

Backup Buddy for WordPress

Designers who use WordPress have many options to save their websites, both local and remote. Starting at $75, Backup Buddy preserves and restores both the files and database of a WordPress website. If you have the files or don't have the budget, try a database-only solution, such as WP DB Manager.

Backup and Migrate for Drupal

Designers on Drupal can use the Backup and Migrate module, which does what it says. Drupal also recommends other backup strategies to save a website.

Choosing a Service

No one strategy applies to every user's needs. But there's enough overlap between Backblaze and CrashPlan to give would-be customers pause.

Ostensibly less developed, Backblaze offers fewer options, making for a streamlined service. One pricing plan and one application means that new customers can begin saving their data minutes after reading the sales pitch. Its business package follows the same prices but combines billing and data management for many computers into one account. The price per month decreases as customers buy more time: $95 for two years, $50 for one year and $5 per month.

CrashPlan plays to different digital needs with a wider range of plans and options.

Both services worked well in my tests. While Backblaze delivers what it promises and no more, CrashPlan struck me as a business that targets specific markets. After using both for several weeks, I found Backblaze sufficient for my work. Your experience will vary, of course, and at the time of writing I have another 49 weeks with both. Which will I prefer this time next year? Time will tell.

Either service works better than doing nothing, as I discovered when I wiped my hard drive back to factory settings. Backblaze's motto could speak for every service: "Back up, before you wish you had."

(al)


© Ben Gremillion for Smashing Magazine, 2012.

Shapeshift: Inspired by jQuery Masonry with Drag & Drop
Dec 28th 2012, 07:33


Shapeshift is a plugin that dynamically arranges a collection of elements into a grid in their parent container. Shapeshift is intended to be a very bare-bones version of these grid systems; the drag and drop is what sets it apart from other similar plugins. Position any item within the grid by dragging and dropping it into place, and Shapeshift will try to find a logical place for it and display that to you.

Resizing the grid to accommodate more or less space is automatically enabled in Shapeshift, so if your parent container has a 100% width, resizing the window will shapeshift the child objects around to fit the new layout. You can even set CSS3 media queries on your objects and watch as they get shapeshifted around at their new size.
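For reference, here's a minimal usage sketch (not from the post; it assumes the plugin follows the usual jQuery plugin pattern, that jQuery and jquery.shapeshift are already loaded, and the container class name is only illustrative):

    // Arrange the children of a container into a drag-and-droppable grid.
    $(function () {
        // ".ss-container" is a made-up class name; target whatever element wraps your items.
        $(".ss-container").shapeshift();
    });

See the demo link below for the plugin's real options and events.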


Requirements: jQuery Framework
Demo: http://mcpants.github.com/jquery.shapeshift/
License: MIT License


How to Create a Windows 8 RSS Reader App with HTML5
Dec 28th 2012, 07:01


Starting from scratch, we're going to learn through these 2 tutorials how to build a small RSS reader with HTML5, CSS3 and WinJS, the Microsoft JavaScript framework for Windows 8. We will build a WinRT application targeting the Windows Store, and we'll also try to follow the Windows 8 UI design guidelines by using Expression Blend 5. If everything goes well, you should be able to follow these 2 articles in 30 minutes.

windows-8-apps

This first article will help you create the welcome screen, which will use a WinJS ListView control to display all the recently published blog posts as nice thumbnails. The second will cover the detail view displayed when you click on one of the items. At the end, you'll find the final solution to download, a useful complementary resource if you need to clarify some parts of this article.

Prerequisites: to follow these tutorials, you first need to:

1 – Download, buy and install Windows 8 RTM on your machine; a 90-day trial version is also available.
2 – Download and install Visual Studio 2012 Express RTM for Windows 8, which is free, or use one of the paid versions of course.

Note: if you've got a Mac, it works perfectly well thanks to Boot Camp or inside a virtual machine handled by Parallels, for instance.

Note 2: this article was updated on 21/08/2012 to reflect the changes in the UI and in the code between Windows 8 Release Preview and RTM. In general, if you need to migrate your application from the Release Preview, you should read the breaking changes document. In our case, the only impact was the new UI and naming of Visual Studio.

Note 3: I've added a complementary post dedicated to WordPress and Community Server here: Windows 8 HTML5 Metro Style App: RSS reader in 30min – building your WordPress version

Here is a brief summary of what we're going to see in this article:

  • Step 1: creating a blank application
  • Step 2: creating the HTML & CSS base of our main page
  • Step 3: first contact with Blend
  • Step 4: loading the data with XHR and bind them to the ListView control
  • Step 5: using a template and modifying the design with Blend
  • Step 6: source code to download

Note: these tutorials are based on the Tools for building Metro style apps session of the BUILD delivered by Chris Sell & Kieran Mockford. I've simply updated it for Windows 8 RTM.

Step 1: creating a blank application

First thing you need to do is to launch Visual Studio 2012 and create a new JavaScript –> Windows Store Blank App project via "File –> New Project":

Name it "SimpleChannel9Reader " as we're going to download the RSS stream of the Coding4Fun section of Channel9 available here: http://channel9.msdn.com/coding4fun/articles/RSS

Step 2: creating the HTML & CSS base of our main page

Open the "default.html" file which describes the first page that will be displayed when you'll launch the application. Instead of the following HTML part:

<p>Content goes here</p>

Insert this one:

<div id="main"> <header id="banner"> <button id="backbutton" class="win-backbutton"> </button> <h1 id="maintitle" class="win-title">Welcome to Channel 9!</h1> </header> <section id="content"> </section> </div>

We now have a global div container with the "main" id embedding 2 sub-containers named "banner" and "content". The header will obviously be displayed at the top of the page and the content section just below.

Let's add a bit of CSS to that by opening the "default.css" file stored in the "css" directory. You'll see that there is already some predefined CSS to handle the various available Windows 8 views thanks to Media Queries.

In these 2 articles, we will concentrate our efforts only on the "fullscreen-landscape" state. So jump into this section and insert the following piece of CSS:

#main {
    width: 100%;
    height: 100%;
}

#banner {
    width: 100%;
    height: 100%;
}

#backbutton {
}

#maintitle {
}

#content {
    width: 100%;
    height: 100%;
}

This simply indicates that we'd like to take all the available space for our 3 main containers.

Run your application by pressing the F5 key or by clicking on the following button:

Logically, you should now see this screen:

And you should also see an obvious design problem: the back button and the title are not aligned. Let's resolve this by using Blend 5!

Step 3: first contact with Blend

Launch Blend and navigate to the folder where your SimpleChannel9Reader project is. Blend will then show that:

The goal here is to create 2 grids. The first one will be for the main container: it will be defined by 1 column taking all the available width and by 2 rows. The second will be defined by 1 row and 2 columns, for the back button and the title.

Let's start by selecting the "main" element using the "Live DOM" window:

Jump to the "CSS Properties" part, select the #main rule and in the "Layout" window, switch the display to "-ms-grid":

We're going to use the CSS Grid Layout specification, currently only supported by IE10 but which should soon land in other browsers. If you'd like to know more about the types of layout supported in the Metro mode, you can read this article: Choosing a CSS3 layout for your app.

If you simply want to discover the CSS3 Grid specification, feel free to play with this IE Test Drive demo: Hands On: CSS3 Grid Layout

Ok, now that the display is properly switched into grid, we need to define our grid. For that, jump to the "Grid" part and declare the following properties:

We will have a single column spanning the complete width of the screen (whatever the resolution) and 2 rows. The first row will have a fixed height of 132px and the other one will take the remaining space. You can see this inside the Blend designer surface:

Now, we're going to move the "content" element into the second row. Select it in the "Live DOM", choose the #content rule and go to its "Grid" properties. Change the "-ms-grid-row" value to 2. You can also move the "banner" element to row 1, but it will be there by default anyway.

We're now going to split our first row into 2 columns in order to move each element into the right place. Select the "banner" element, switch its display property to "-ms-grid" and define a 1fr row and 2 columns of 120px and 1fr:

Move the "maintitle" element into column 2 and center it vertically thanks to the "-ms-grid-row-align" property changed to "center":

Select the "backbutton" and jump to the "Layout" part. Set a 54px top margin and a 40px left margin. If you haven't missed something, you should now see that on the design surface:

Save all changes via "File" -> "Save All" and go back to Visual Studio. Open "default.css" and you'll see that Blend has generated some "clean" CSS in the right rules:

@media screen and (-ms-view-state: fullscreen-landscape) {
    #main {
        width: 100%;
        height: 100%;
        display: -ms-grid;
        -ms-grid-columns: 1fr;
        -ms-grid-rows: 132px 1fr;
    }
    #banner {
        width: 100%;
        height: 100%;
        display: -ms-grid;
        -ms-grid-columns: 120px 1fr;
        -ms-grid-rows: 1fr;
    }
    #backbutton {
        margin-top: 54px;
        margin-left: 40px;
    }
    #maintitle {
        -ms-grid-column: 2;
        -ms-grid-row-align: center;
    }
    #content {
        width: 100%;
        height: 100%;
        -ms-grid-row: 2;
    }
}

Simply check that the application works fine by pressing F5.

Step 4: loading the data with XHR and bind them to the ListView control

Ok, let's now dig a little bit into the code.

First thing to do is to insert the control that will be in charge of displaying our articles' thumbnails on the welcome screen. We're going to use WinJS for that.

The WinJS library, or "Microsoft Windows Library for JavaScript", is there to help JavaScript developers implement the new Windows 8 UI experience in an easy way. It provides a set of controls, a templating engine, a binding engine, Promises to handle asynchronous calls, helpers to handle namespaces, etc.

In Windows Store projects, you'll find this library in the references section of the "Solution Explorer":

Inside, you'll find the default style sheets with the two dark and light themes provided, as well as the JavaScript code. Feel free to have a look at it; reading the code is an interesting way to learn.

In our case, we're going to use the ListView control which creates a grid layout to display the list of elements.

Open "default.html" and inside the section tag, type this piece of HTML:

<div id="articlelist" data-win-control="WinJS.UI.ListView"></div>

Currently, it's only a simple, classical div. However, it's annotated with the data-win-control attribute, which indicates that we'd like the WinJS library to transform this simple div into a JavaScript ListView control.

This operation is done thanks to a magical line of JavaScript code you'll find in "default.js". Here it is:

WinJS.UI.processAll();

This asynchronous operation parses the DOM to find all the elements tagged with the "data-win-control" attribute and transforms them into real WinJS controls implementing the new Windows 8 UI experience for you. If you remove this line by mistake, all your divs will go back to being simple divs.

We now need to feed this ListView with some data grabbed from the RSS feed. In the function bound to the "onactivated" event, add this piece of code just above the processAll() line:

articlesList = new WinJS.Binding.List();
var publicMembers = { ItemList: articlesList };
WinJS.Namespace.define("C9Data", publicMembers);

You'll then need to declare the "articlesList" variable at the top of the function, just below the "app" one for instance.

Here we're declaring a Binding.List(), the type to use to bind your data to WinJS controls. It provides methods that let you add data in the background and, thanks to the binding mechanism, the changes are reflected in the view automatically.

Moreover, you may have noticed that we're keeping the JavaScript clean by using modern patterns like the module pattern: "default.js" wraps everything in an anonymous, self-executing function. We therefore need a way to expose some public data to external code, which is why we use the Namespace helper of WinJS; it lets us easily define what we'd like to expose. In our case, we will have a public "C9Data" object with an "ItemList" property containing the elements to display.

We now need a function that will grab the data from the RSS feed, parse it and create some JS objects on the fly to push them into the famous binding list. Here is mine:

function downloadC9BlogFeed() {
    WinJS.xhr({ url: "http://channel9.msdn.com/coding4fun/articles/RSS" }).then(function (rss) {

    });
}

This function starts by running an asynchronous XmlHttpRequest to the specified URL. The code defined inside the Promise (inside the .then(), if you prefer) is executed only once the request has finished and the data has been received. This is where we need to filter the data, via this piece of code you have to insert into the anonymous function:

var items = rss.responseXML.querySelectorAll("item");
for (var n = 0; n < items.length; n++) {
    var article = {};
    article.title = items[n].querySelector("title").textContent;
    var thumbs = items[n].querySelectorAll("thumbnail");
    if (thumbs.length > 1) {
        article.thumbnail = thumbs[1].attributes.getNamedItem("url").textContent;
        article.content = items[n].textContent;
        articlesList.push(article);
    }
}

I hope this code is self-explanatory. It selects the "item" nodes and maps their interesting properties onto an "article" object created on the fly, with "title", "thumbnail" and "content" properties. Please keep the names of those properties in mind; we will use them later on. Finally, the function adds each new object to the binding collection.

We now need to run this function during the startup phase of our application. This code should run once the DOM parsing is done and the WinJS controls have been built. To do that, use this line of code:

WinJS.UI.processAll().then(downloadC9BlogFeed);
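To recap how the JavaScript pieces fit together, here is a condensed sketch of "default.js" (my own assembly of the snippets above on top of the standard Blank App template; the template's boilerplate and error handling are omitted, so your generated file will differ slightly):

    (function () {
        "use strict";

        var app = WinJS.Application;
        var articlesList;

        // Download the RSS feed, parse it and push one object per article
        // into the binding list declared below.
        function downloadC9BlogFeed() {
            WinJS.xhr({ url: "http://channel9.msdn.com/coding4fun/articles/RSS" }).then(function (rss) {
                var items = rss.responseXML.querySelectorAll("item");
                for (var n = 0; n < items.length; n++) {
                    var article = {};
                    article.title = items[n].querySelector("title").textContent;
                    var thumbs = items[n].querySelectorAll("thumbnail");
                    if (thumbs.length > 1) {
                        article.thumbnail = thumbs[1].attributes.getNamedItem("url").textContent;
                        article.content = items[n].textContent;
                        articlesList.push(article);
                    }
                }
            });
        }

        app.onactivated = function (args) {
            if (args.detail.kind === Windows.ApplicationModel.Activation.ActivationKind.launch) {
                // Expose the binding list as C9Data.ItemList so the ListView's
                // data-win-options can reference it from the markup.
                articlesList = new WinJS.Binding.List();
                WinJS.Namespace.define("C9Data", { ItemList: articlesList });

                // Build the WinJS controls declared in the markup, then fetch the feed.
                args.setPromise(WinJS.UI.processAll().then(downloadC9BlogFeed));
            }
        };

        app.start();
    })();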

We now have to tell the control about its data source. Jump back into the HTML code and modify the div associated with the ListView to change its options:

<div id="articlelist" data-win-control="WinJS.UI.ListView" data-win-options="{ itemDataSource: C9Data.ItemList.dataSource }"></div>

Finally, we need some basic CSS to tell the control how to draw each of its items. Jump to the "default.css" file and add these 2 rules:

#articlelist {
    width: 100%;
    height: 100%;
}

#articlelist .win-item {
    width: 150px;
    height: 150px;
}

This CSS indicates that our ListView control should take all the available space of its container and that each of its items (via the ".win-item" class) should take 150 by 150 pixels.

Run the solution by pressing F5. You'll have something as ugly as that:

But don't panic: this ugly output is the expected behavior! We still have a bit of design work to do, but you can already see that the binding works correctly and that the control handles both touch and mouse. Moreover, the control automatically scales to the various resolutions, so you may not get exactly the same layout (number of columns and rows displayed) as in the screen above.

Step 5: using a template and modifying the design with Blend

We now need to change the way each element is drawn. This is exactly the purpose of the templating engine. A template is only a piece of HTML marked with WinJS attributes.

Navigate to the "default.html" page and add this piece of HTML just above the "main" part:

<div id="C9ItemTemplate" data-win-control="WinJS.Binding.Template" style="display: none;"> <div class="listItemTemplate"> <div class="listItemImage"> <img data-win-bind="src: thumbnail" /> </div> <div class="listItemTitle" data-win-bind="innerText: title"> </div> </div> </div>

It's an HTML template marked with the "WinJS.Binding.Template" value, which tells WinJS what to do with this special piece of HTML after the processAll() execution. You'll also find the "data-win-bind" attribute, used to define binding expressions: it tells the binding engine which JavaScript properties from the data source to map to which HTML nodes. Apart from that, it's classic HTML.

We now need to configure the WinJS control to not use the default template anymore but to use the new one above instead. It's done by simply changing the options:

<div id="articlelist" data-win-control="WinJS.UI.ListView" data-win-options="{ itemDataSource: C9Data.ItemList.dataSource, itemTemplate: C9ItemTemplate }"> </div>

If you now run the application, you should have this screen:

It's better but we're not done yet. To go further in the design review, we need the help of our friend Blend.

So, let's go back into Blend. It will ask you to reload all the modifications you've done inside Visual Studio. Once done, you'll have that:

Aren't you surprised? You should be! Indeed, we see here the same visual output you get when you press F5 in Visual Studio. This means that Blend 5 dynamically executes the JavaScript part of your application directly inside the designer. This is an awesome feature.

Thanks to that, you can work directly on real data without having to put mocking in place. That's the cool part of JavaScript: Blend is able to execute the JS code that launches the XHR request and builds the WinJS objects.

Under "default.css", let's add 2 new CSS rules. Click on the "+" button on the main media query:

And add these new selectors:

.listItemTemplate and .listItemTemplate img

Select the #articlelist .win-item rule that will highlight each element of the ListView control with the "articlelist" ID.

Change the size of these elements to go from 150px by 150px to 250px by 250px. You simply need to jump into the "Sizing" part of the right panel.

The layout should be updated dynamically. If not, you can force a refresh of the design surface by clicking on the dedicated button:

And here is the result you should have:

We're now going to resize the template's images. For that, use the "Selection" pointer and click on one of the images:

You can see the currently applied CSS rules in the "Applied Rules" section. Click on ".listItemTemplate img" and resize the image you've just selected with your mouse. All the other images matching the same selector will then reflect the changes dynamically, in real time.

Rather than having you search for the appropriate size, I'll help you out: jump into the "Sizing" section and set the following size: 234px width and 165px height.

To enhance our design a bit more, we need some more space between each element, and we need to align our ListView control with the title.

Click on the ".listItemTemplate" selector, navigate to the "Layout" section and click on the "Lock" icon at the right of the "Margin" area. Select any margin and type 8px.

Finally, to align the grid of the ListView control with the title, we need to offset it from the left by 120px minus the 8px element margin we've just set.

Add a new selector by pressing the "+" button and name it ".win-surface". Set a left margin of 112px.

Go back to Visual Studio, accept the changes done and press F5. You should now have this kind of layout:

Step 6: source code to download

We've made good progress so far. In the next article, we will display the detail of each article and continue discovering the power of Blend, as well as a couple of cool new CSS3 features. You can download the solution associated with this first article here: Simple Channel9 Reader Article1

About the Author

David Rousset is a Developer Evangelist at Microsoft, specializing in HTML5 and web development. Read his blog on MSDN or follow him @davrous on Twitter.


Does your technolust transcend tech?
Dec 28th 2012, 01:21

My ever-constant thirst for knowledge is often quenched by Wikipedia – at least as a jumping-off point. Is yours? The fantastic Wiii Lecturer Android app may become a new favorite.

The post Does your technolust transcend tech? appeared first on Technolust since 2005.

Hak5 1219 – WiFi Packet Sniffing on Android and Ubuntu Backups
Dec 25th 2012, 19:00

This time on the show, packet captures on Android - Mike Kershaw, aka Drag0rn, joins us to talk about his rootless Android pcap app. Then, backing up Ubuntu the easy way. Plus, apt tips, sudo aliases and the return of blackbuntu? All that and more, this time on Hak5!

Download HD Download MP4

Android PCAP Capture
"

Android PCAP Capture is a utility for capturing raw 802.11 frames ("Monitor mode", or sometimes referred to as "Promiscuous mode"). The resulting Pcap files can be viewed on a computer using Eye P.A., Wireshark, Tcpdump and similar tools, or online using CloudShark.

Android PCAP works with Android phones running version 4 (ICS) or higher and Wi-Fi cards that use the RTL 8187 chipset.

Android PCAP Capture"

Back Up Your Ubuntu Machine

One of the most important tasks, backing up your machine, can be a pain if you don't know what to use or what would work best for your needs. Ask yourself what kind of backup do you need? Full or incremental? Where should you back up to? Local or cloud? When should you back up? Every day or every week, etc? Should you automate the process?
Backing up your Ubuntu machine has now been made easy with a built-in tool called Deja Dup, available for Ubuntu 11.10 and higher.

Feedback

beardy_jesse says: I noticed when Darren was installing stuff like python-usb and rfcat a couple of episodes ago that when the "are you sure [Y,n]" option appeared he typed "Y" to continue. While this is sensible, the capitalised item in the brackets is the one that linux assumes you want, and will be performed simply by pressing enter. So instead of wasting a good half second reaching for the shift and another half second pressing 'y' you can plough through that option just by hitting enter.

Chris R writes: Have you ever considered aliasing Sudo to Please? Just food for thought.

John says: Blackbuntu is somewhat alive on SourceForge: http://sourceforge.net/projects/blackbuntu/ Its last update was 8.4.2011, but the page is still there. Thanks for your awesome podcast!!!! Love it!!! You and Darren get along so well, though I bet you get along with everyone anyway!

The post Hak5 1219 – WiFi Packet Sniffing on Android and Ubuntu Backups appeared first on Technolust since 2005.

HakTip 73 – Linux Terminal 101: Typing Less with Keyboard Shortcuts
Dec 21st 2012, 19:00

Today we're typing less with keyboard shortcuts!

Download HD Download MP4

Once you think you've learned the easiest way to input a command or control, there always seems to be a new way! My goal is to never need to use my mouse... but that may take some time. Here are a few tips that can help!

Moving your cursor around: Use CTRL-A, CTRL-E, CTRL-F, and CTRL-B to move the cursor to the beginning of the line, the end of the line, forward one character, or back one character. Typing clear will clear out your terminal.

Edit the text in your command: Use CTRL-D or CTRL-T to delete or exchange the character at your cursor's location. Use ALT-L or ALT-U to convert the characters from the cursor to the end of the word to lowercase or uppercase.

Cutting and pasting: This is also called 'killing and yanking'. Use CTRL-K or CTRL-U to kill text from the cursor to the end of the line or from the beginning of the line to the cursor. Use ALT-D or ALT-Backspace to kill text from the cursor to the end of the word or from the cursor to the beginning of the word. Use CTRL-Y to take text that has been killed and insert it back at the cursor location.

We're back with keyboard shortcuts! We've already gone over the tab completion in a previous HakTip, so we'll now check out some other commands. Instead of tab, you can also use ALT-? to show a list of completions you can use, and ALT-* to insert all of those possibilities.

Check out the bash man page under Readline to see more completion commands.
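
You can also dump the bindings your current bash session knows about straight from the shell – a quick sketch:

# List every readline key binding in the current bash session
bind -P | less

# Show just the completion-related bindings (possible-completions, insert-completions, etc.)
bind -P | grep -i complet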

What are your favorite shortcuts for text completion? Make sure to email me at tips@hak5.org with your thoughts. And be sure to check out our sister show, Hak5, for more great stuff just like this. Don't forget to check out our new show, Threat Wire, for internet privacy and security news at youtube.com/techfeed. I'll be there, reminding you to trust your technolust.

The post HakTip 73 – Linux Terminal 101: Typing Less with Keyboard Shortcuts appeared first on Technolust since 2005.

iPad hacking with the USB Rubber Ducky
Dec 20th 2012, 10:34

www.usbrubberducky.com

The post iPad hacking with the USB Rubber Ducky appeared first on Technolust since 2005.

Shannon is broken
Dec 20th 2012, 09:45

The post Shannon is broken appeared first on Technolust since 2005.

Hak5 1218 – Extreme GPU Password Cracking and Home Theater PC Builds
Dec 19th 2012, 19:00

Cracking every standard Windows password in less than 6 hours with a massive GPU cluster, building a home theater PC for about $300 and blinkenlights. All that and more, this time on Hak5!

Download HD Download MP4

Jeremi Gosney's Massive Password Cracking GPU Cluster

I had a chance to talk to Jeremi Gosney about the latest advances in password cracking. Gosney, the CEO of Stricture Consulting Group, recently showed off his latest password cracking rig at the Passwords^12 conference in Norway. The rig, which uses 25 AMD Radeon graphics cards, is able to bust every possible 8-character NTLM hash in about 5.5 hours. NTLM has been included in Windows since Server 2003 and replaces the considerably weaker LM hash (which is the password-hash equivalent of WEP – a joke). Gosney's rig is unique in that it uses VCL virtualization to allow a single controller to communicate with multiple machines loaded with graphics cards. Using HashCat Plus, the rig is able to make 350 billion attempts per second against NTLM, 63 billion per second against SHA1 and 180 billion per second against MD5. Bcrypt and SHA512crypt are "safer" for now at 71,000 and 364,000 attempts per second respectively. If you haven't already, go and make your password more complex – and for the love of God stop using the same one on every site.
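
You obviously won't have Gosney's 25-GPU cluster at home, but for a feel of what this kind of attack looks like, here's a minimal sketch using the hashcat command line – mode 1000 is NTLM, -a 3 is a mask (brute-force) attack, and ntlm-hashes.txt is a placeholder file with one hash per line:

# Try every 8-character combination of letters, numbers and symbols against the NTLM hashes
hashcat -m 1000 -a 3 ntlm-hashes.txt '?a?a?a?a?a?a?a?a'

# Show anything that cracked
hashcat -m 1000 --show ntlm-hashes.txt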

The post Hak5 1218 – Extreme GPU Password Cracking and Home Theater PC Builds appeared first on Technolust since 2005.

Threat Wire 0004 – All your passwords are belong to us
Dec 14th 2012, 21:44

  • Tumblr vs GNAA
    • Tumblr users may have experienced some spam messages on their feeds, produced by an anti-blogging campaign from a group called the GNAA, who believe blogs have lowered the standards of journalism.
    • The worm started on the "Brony" tag on Tumblr. When a user would repost the content with that tag, they would get spam on their account, and it grew from there.
    • Tumblr has removed the spam from the GNAA worm, but BuzzFeed advised users to stay on the Dashboard, avoid following direct Tumblr links, and change their passwords.
    • The Tumblr staff blog confirms that “No accounts have been compromised” and that “you don’t need to take any further action”, but it is always a good practice to periodically change your password.
  • 25-GPU cluster cracks every standard Windows password in <6 hours
    • http://arstechnica.com/security/2012/12/25-gpu-cluster-cracks-every-standard-windows-password-in-6-hours/
    • Interview snippet with Jeremi Gosney
    • Gosney, the CEO of Stricture Consulting Group, recently showed off his latest password cracking rig at the Passwords^12 conference in Norway.
    • The rig, which uses 25 AMD Radeon graphics cards, is able to bust every possible 8-character NTLM hash, the password hashes used by Windows, in about 5.5 hours.
    • For the most part, passwords aren’t stored in a database in the clear, because anyone with access to the database would have the password. Rather, a one-way hash function is used – so if my password were “godlovesecretsex”, it would be stored as a “hash”, a seemingly random bunch of letters and numbers. When I type my password in, it runs through the same hash function, and if what I type and what’s stored match, I’m granted access. (A small command-line sketch of this follows the list.)
    • It’s called a one-way hash because it can’t be reversed. The only way to know that the gibberish stored in the database translates to “godlovesecretsex” is to try every possible combination until you find a match.
    • Attempting every possible 8 character combination of letters, numbers and symbols is seriously time consuming for the average computer — but Gosney’s rig, using the open-source HashCat program and 25 graphics cards, is able to make 350 billion guesses per second against the hash function used by Windows.
    • This password cracking cluster puts a serious dent in the hashes most commonly used to store passwords – not just Windows. It’ll crack other common functions like SHA1 and MD5 at around 63 billion and 180 billion attempts per second respectively.
    • So what does this mean to you? Well, if a web site is storing your password with one of these hash functions and the password database is leaked – like when 6 and a half million LinkedIn password hashes were leaked in June of this year – your password could be discovered in a matter of hours.
    • Smart web site operators are now using “safer” hash functions, like Bcrypt and SHA512crypt. Breaking these hashes is about a million times slower.
    • Unfortunately there is no practical way to know if a web site uses a strong or weak hash function so the best advice is to use a long passphrase, something 15 characters or more, and use a different password for every site.
  • Richard Stallman calls Ubuntu "spyware" because it tracks searches
    • http://arstechnica.com/information-technology/2012/12/richard-stallman-calls-ubuntu-spyware-because-it-tracks-searches/
    • Richard Stallman, President of the Free Software Foundation, called Ubuntu "spyware" because it sends search data to Canonical.
    • Stallman criticizes a new feature in Ubuntu’s Unity interface which makes product recommendations from Amazon when you use the Dash search feature. For example, if you type “Calculator” you’d see both the application and calculators for sale from the retail giant.
    • The Electronic Frontier Foundation in October made a blog post underscoring the privacy invasiveness of the new feature while offering recommendations to disable it.
    • Canonical, the makers of Ubuntu, have since responded saying that they plan to expand the search function in the next version of the operating system. The upcoming Smart Scope feature will search local and online sources, add instant purchasing, and include more retailers.
    • Canonical still says your privacy is important and that none of the data transmitted will be user-identifiable. They also say that users will know what data is collected through the Dash and that some data collection can be turned off in the settings.
  • ITU vs. the Free and Open Internet
  • Facebook Policy Changes
    • Facebook says they encouraged users to vote on the policy changes to their site, but that poll has ended, with about 700,000 voting – a far cry from the 30% needed for them to take the votes into consideration.
    • Wait, they encouraged users to vote? Where?
    • With this also come some changes to privacy controls! A new privacy shortcut, specific application permissions, easier photo tag removal, and changes to the activity log. But they are also removing the "Who can look up my timeline by name?" function, meaning that if a person types your name into the search bar, they can see your whole timeline. Hopefully the new privacy shortcuts will give users a way to re-implement a similar timeline privacy function, but that is yet to be determined. You still have the option of changing specific posts to private, though.
    • What do you do now? Votes don't matter, so either delete your account, or take precautionary measures to stay on top of Facebook's ever-changing privacy policies so you don't accidentally share more than you want.

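As mentioned above, a one-way hash is easy to see for yourself from a terminal – a tiny sketch using SHA1 (the same idea applies to NTLM or MD5):

# The password is stored as the hash output, not the password itself
echo -n 'godlovesecretsex' | sha1sum

# "Cracking" just means hashing guesses until one matches the stored value
for guess in password letmein godlovesecretsex; do
  echo -n "$guess" | sha1sum
done
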
The post Threat Wire 0004 – All your passwords are belong to us appeared first on Technolust since 2005.

Hak5 1217 – Hack any 4-digit Android PIN in 16 hours with a USB Rubber Ducky
Dec 12th 2012, 19:00

This time on the show, an online brute force attack against Android successfully defeats 4-digit PIN codes in about 16 hours using the USB Rubber Ducky, without wiping user data. Plus, BackBox Linux - is this a pen-testing OS for everyday use? All that and more, this time on Hak5.

Android Brute Force Attack with USB Rubber Ducky
"

Brute forcing Android PIN authentication with a USB Rubber Ducky. Thus far it works perfectly on a Galaxy Nexus running the latest Android 4.2.1. I've also tested it with a Galaxy Note 2 running 4.2.1 and it has run as expected.

I'm very surprised that this was possible with the stock Android OS and the recommended setting of a PIN code. I had expected the phone to reset or format after 100 attempts or something like that.

With a 4-digit PIN and the default of 5 tries followed by a 30-second timeout, you're looking at roughly 16.6 hours to exhaust the entire key space. Not bad, all things considered. If you're the NSA or the Mafia that's totally reasonable, I'd say. Thankfully the USB Rubber Ducky never gets tired, bored or has to pee.
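
The arithmetic is easy to sanity-check from a shell – a back-of-the-envelope sketch that ignores typing time:

# 10^n possible PINs, 5 guesses per 30-second lockout window
for n in 4 5; do
  echo "$n digits: $(echo "10^$n / 5 * 30 / 3600" | bc -l) hours"
done
# Prints roughly 16.67 hours for 4 digits and 166.67 hours for 5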

Rather than post the nearly 600K duckyscript, I'll just post the bit of bash I used to create it. You could modify it to do 5 digits, but that would take 166 hours; 10 digits would take 1902.2 years. ;-)

echo DELAY 5000 > android_brute-force_0000-9999.txt; echo {0000..9999} | xargs -n 1 echo STRING | sed '0~5 s/$/\nWAIT/g' | sed '0~1 s/$/\nDELAY 1000\nENTER\nENTER/g' | sed 's/WAIT/DELAY 5000\nENTER\nDELAY 5000\nENTER\nDELAY 5000\nENTER\nDELAY 5000\nENTER/g' >> android_brute-force_0000-9999.txt

"

BackBox Linux

Daniel says: Have you heard of BackBox? It's an Ubuntu-based OS with a focus on pen testing and an XFCE desktop. But unlike BackTrack, it is actually functional as a day-to-day OS. I've been using it as my main OS for 5 months now and I truthfully believe it doesn't receive enough attention.

Download Back Box Linux

The post Hak5 1217 – Hack any 4-digit Android PIN in 16 hours with a USB Rubber Ducky appeared first on Technolust since 2005.

Threat Wire 0003 – ITU-T approves Deep Packet Inspection recommendation
Dec 7th 2012, 19:14

  • ITU-T makes recommendation for Deep Packet Inspection
    • While the Internet opposes the upcoming WCIT, or World Conference on International Telecommunications, this December 3rd through the 14th in Dubai, a related conference has ended, bringing some chilling insight into what could be the future of the Internet.
    • As we talked about last week, the ITU, or International Telecommunication Union, is a United Nations agency, formed in 1865 and made up of 193 nations, that develops global communications interoperability.
    • The WCIT conference is where these nations will be voting on ITRs, or International Telecommunication Regulations; treaties which haven’t been updated since 1988.
    • Unlike governing bodies of the Internet like the Internet Engineering Task Force and Internet Architecture Board, ITU proposals aren’t public – there’s a complete lack of transparency. It’s only through sites like WCITLeaks.org that we know what’s even being proposed.
    • And like the Megazord the ITU has a few arms, including the ITU-T, or ITU Telecommunication Standardization Sector. (Hacker culture note: this agency spawned out of the CCITT – you know, the big blue book that doesn’t fit on a shelf)
    • The ITU-T holds a conference, called the WTSA or World Telecommunication Standardization Assembly, every four years to produce telecommunications standards, which they refer to as “Recommendations” – these eventually become mandatory when adopted as part of national law.
    • One such “Recommendation” raising concerns amongst privacy advocates was approved on November 20th. Known as work item Y.2770, this recommendation is titled “Requirements for deep packet inspection in Next Generation Networks”.
    • Deep Packet Inspection, or DPI, is a networking practice whereby packets – the bits of information that make up all of our communications on the Internet – are filtered, examined and stored at router end-points. Basically it can be used for data mining, eavesdropping and Internet censorship. (A small command-line sketch of payload inspection follows this list.)
    • The approved Recommendation shows little respect for privacy, including support for the inspection of encrypted traffic “in case of a local availability of the used encryption key(s).” Why the government would have such a key locally available is scary, to say the least. The document goes on to discuss IPSec, a popular Virtual Private Network technology, noting that “aspects related to application identification are for further study”. Great.
    • Furthermore, the document states that information extracted by DPI “is required to be protected”. Wonderful – would this mean network operators would then be required to store this sensitive user data? That’s not a security risk at all. /sarcasm
    • The approval of this Deep Packet Inspection recommendation at the WTSA only underscores the urgency of opposing the upcoming closed-door WCIT.
    • Get involved

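As referenced above, the kind of payload inspection DPI performs at carrier scale can be sketched on a network you own with nothing more than tcpdump – eth0 is a placeholder interface name:

# Print the ASCII payload of HTTP packets, i.e. look inside the traffic rather than just at headers
sudo tcpdump -i eth0 -A 'tcp port 80'
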
The post Threat Wire 0003 – ITU-T approves Deep Packet Inspection recommendation appeared first on Technolust since 2005.

HakTip 72 – Linux Terminal 101: My Top Best Resources
Dec 7th 2012, 19:00

This week we are checking out some great resources to help you be an expert Linux user!

Download HD Download MP4

There are tons of websites and books out there that can help you learn how to use Linux, and you have sent in plenty as well. Here are my top websites and books to make you an expert Linux user in weeks.

CommandLineFu helps you learn complex commands and check out their popularity via the votes on the page. This site also lets you easily copy the command straight from the site.

I have a printout of this one sitting next to my desk at work in case I'm having a horrible brain fart. This PDF lists several common commands and what they do for quick reference! This one is a must have if you're just starting out.

Check out O'Reilly.net for a huge listing of commands that you can also find in their book, Linux in a Nutshell. This won't give you examples so to speak, but it will help with confusing command descriptions.

This blog post from Free Video Lectures has links to 5 different sites that are extremely helpful for learning commands. Many of these list the information in a table of contents, and help you understand not only the command name, but also why it works.

On to the books! The Linux Command Line is a wonderful book if you've never even touched the OS. William Shotts describes everything in a way that's easy to understand and use to your advantage. I couldn't recommend this book enough, as it has helped me immensely.

Ubuntu for Non-Geeks makes using not only the terminal, but also the GUI, easy. This is a great book to check out if you've used strictly Windows growing up or if you're converting from Mac. Grant covers everything from simple tasks like keeping Ubuntu updated to, of course, the terminal.

How could I not include the Linux Pocket Guide? Everyone I know has one of these, even Darren (though you'd think he has everything memorized!)

Working at the Ubuntu Command Line is a great nitty-gritty book – and super cheap – for getting into more advanced tutorials quickly.

Ubuntu Made Easy is another great book by Grant that helps you do all those menial tasks like setting up printers and using the programs that are available in Linux. This is another one I recommend a ton!

You guys have sent in some great tips and feedback – keep 'em coming! Thanks to everyone that has been sending feedback for our Linux Terminal 101 series. I've really enjoyed learning what I have, and I hope you enjoyed these resources.

Make sure to email me at tips@hak5.org with your thoughts. And be sure to check out our sister show, Hak5, for more great stuff just like this. I'll be there, reminding you to trust your technolust.

The post HakTip 72 – Linux Terminal 101: My Top Best Resources appeared first on Technolust since 2005.

Hak5 1216 – Android Hacking with the USB Rubber Ducky
Dec 5th 2012, 07:00

This time on the show, hacking Android with the USB Rubber Ducky. Darren revisits the original Human Interface Device attack tool and shows off the 4th generation hardware. Plus, improve your Ubuntu box's performance with a few simple tips - Shannon reports. All that and more, this time on Hak5!

Download HD Download MP4

Revisiting the USB Rubber Ducky and Lethal Android Payloads

A lot has happened since we first introduced the USB Rubber Ducky hardware a little over a year ago. We excelled in some areas, fell flat in others, and over time, with the help of the community, have come close to where the project should be.

First, a little background. The USB Rubber Ducky concept is quite simple - violate the inherent trust the computer has in the human. If you can gain physical access to a machine, even for just a few seconds, you should be able to inject a payload at extreme speed using just keystrokes. This is done with relative ease given the fact that all computers, since the beginning, have trusted keyboards as they represent human input. The USB HID class allows us to mimic a keyboard while injecting preprogrammed keystrokes.

The project started as a proof of concept using a USB development board called the Teensy. This small Arduino clone could perform a HID attack, as demonstrated on early episodes of Hak5. Darren showed his USB Rubber Ducky prototype off to IronGeek at Shmoocon 2010, and a month later the cat was out of the bag. IronGeek recreated the attack using the Teensy and demoed it, crediting Hak5, at OuterZone in March. That month the USB Rubber Ducky prototype was demoed on Hak5 and a development team was kickstarted by sending 100 boards to developers around the world.

Based on feedback from these developers we came to a few conclusions. In order for the USB Rubber Ducky to be a success we needed to make it simple. Rather than program and flash a device using C code, we developed a scripting language which could be written in standard text files. A cross-platform program would convert the text file into a binary to be moved onto the root of a micro SD card. With the micro SD card inserted into the USB Rubber Ducky the HID attack was ready for deployment. To further enhance the USB Rubber Ducky as a covert HID attack tool it was fitted with a generic USB flash drive case. The custom hardware USB Rubber Ducky was born.
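
As a rough sketch of that workflow – the payload below is purely illustrative, and the encoder invocation assumes the usual duckencoder.jar -i/-o flags, so check the encoder's own usage output:

# Write a tiny duckyscript payload as plain text
cat > hello.txt <<'EOF'
DELAY 1000
GUI r
DELAY 500
STRING notepad
ENTER
DELAY 1000
STRING Hello from the USB Rubber Ducky
EOF

# Convert it to inject.bin and drop that on the root of the microSD card
java -jar duckencoder.jar -i hello.txt -o inject.bin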

The first generation USB Rubber Ducky wasn't without some serious issues to overcome. The small-batch PCB assembly came at such a high cost that the initial retail price was $80 - three times that of an adequately equipped Teensy. The latch holding the microSD card could inadvertently spring open in use. The firmware was only able to attack Windows targets, and the ducky script encoder only supported US keyboards. The latter was a huge oversight by the US-centric development team. To make matters worse, licensing issues encumbered the timely open-sourcing of the firmware.

What had started as a modest hardware project turned out to be a nightmare. Developers were unhappy with the lack of source code, the high price and the compatibility problems. The ducky team tried several firmware fixes only to fall flat and waste time. Eventually the licensing restrictions were overcome and the source code was published on GitHub.

Since then the promise of community development has shown its power. One developer in particular, Midnight Snake, took on two of the most challenging issues -- cross platform compatibility and international language support. During this time Hak5 worked on several hardware revisions of the USB Rubber Ducky, replacing the faulty microSD card latch with a slot and finding ways to lower the costs of production.

So far there have been four hardware revisions. The first (black) debuted at $80, while across the second (red), third (white) and current fourth (green) revisions the hardware has finally come down to half its launch cost.

Furthermore, several enhancements have been made to the way payloads are generated. At first a wiki and forum were set up to share payloads. Several have been shining examples of the USB Rubber Ducky's power - like the four-line wget & execute from PowerShell by Mubix, or the Windows 7 backdoor and 15-second reverse shell.

To simplify the payload-writing process, several of the most popular payloads have been adapted to the online generator at usbrubberducky.com. Simply fill in the blanks, click generate and receive a bin file ready for use on the USB Rubber Ducky.

Android hacking has also debuted. Following the introduction of Kos' (kos.io) P2P-ADB attack, and the subsequent Micro-to-Micro OTG or "Kos Cable" we made for him, we're excited to publish a few useful Android payloads. The first enables developer mode and USB debugging, perfect for use with Kos' P2P-ADB attacks, while another simply adds an open WiFi access point to the device so Android can more easily be friends with the WiFi Pineapple.

A tremendous amount of progress has been made over the last year, and it's thanks in large part to the USB Rubber Ducky community, which has continued to support the platform. With a lot of the bugs worked out, costs reduced and the process made even simpler, we're very excited to see what's in store for the next generation of the USB Rubber Ducky.

How to Speed Up Ubuntu

The UpUbuntu blog has a bunch of useful info for Ubuntu users, and here are some handy tips you can use to speed up older versions of the OS or older computers.
1. The daemon Preload keeps commonly used apps cached in the background so they can launch with faster load times. It monitors the applications users run and, by analyzing this data, fetches those binaries and their dependencies into memory for faster startup times. To install: sudo apt-get install preload. Preload's default settings are good, but if you want to tweak them, the link to the UpUbuntu blog is in the show notes.
2. AutoClean your APT cache with this command: sudo apt-get autoclean. This will clean the cache of all the old files.
Why? Old package downloads are cached by apt, and this cache grows over time, taking up space and slowing down the computer. sudo apt-get clean will clean the cache entirely.
3. Disable some of the StartUp Applications via the Unity Dash or install Boot Up Manager (BUM) to disable services: sudo apt-get install bum.
4. Check your current swappiness with: cat /proc/sys/vm/swappiness. This parameter controls how aggressively pages are moved from physical memory to the swap disk. Because disks are slower than RAM, this can lead to a slower machine. The default value is 60; to change it, edit this file: sudo gedit /etc/sysctl.conf. Search for this line (if not present, just add it): vm.swappiness=10. Save your file and exit. Changes will take effect once you reboot your system (a non-interactive sketch of this tweak follows the list). The higher your value (between 0 and 100), the more the system will swap, so if you chose 100, the kernel would always look for inactive memory pages and swap them out.
5. Turn off hibernation by editing this file: sudo gedit /etc/initramfs-tools/conf.d/resume. Comment out the line RESUME=UUID=**** by adding a # in front of it, so it reads #RESUME=UUID=****. Save and reboot.
6. Disable the Grub2 boot menu by editing: sudo gedit /etc/default/grub, then finding GRUB_TIMEOUT and changing it to GRUB_TIMEOUT=0. Hold down SHIFT while rebooting to show the Grub2 menu if need be. Since Grub loads its configuration at startup, the menu delay can slow your machine's start time.
7. You can optimize a PC with low RAM by using ZRAM, which creates a compressed block device in memory that acts as swap, reducing disk thrashing. Run these commands: sudo add-apt-repository ppa:shnatsel/zram, then sudo apt-get update, then sudo apt-get install zramswap-enabler.
8. Remove visual effects using Compizconfig Settings Manager. sudo apt-get install compizconfig-settings-manager. Start it now and head to the Effects section, then disable all enabled effects.
9. Use system RAM for /tmp read/write operations. Edit the file with sudo gedit /etc/fstab and add these two lines at the end:
# Move /tmp to RAM
tmpfs /tmp tmpfs defaults,noexec,nosuid 0 0
10. Use a faster desktop environment: XFCE (sudo apt-get install xubuntu-desktop); Gnome; KDE (sudo apt-get install kubuntu-desktop); LXDE (sudo apt-get install lxde); Enlightenment (sudo apt-add-repository ppa:hannes-janetzek/enlightenment-svn, then sudo apt-get update, then sudo apt-get install e17); Pantheon (sudo add-apt-repository ppa:elementary-os/daily, then sudo add-apt-repository ppa:nemequ/sqlheavy, then sudo apt-get update, then sudo apt-get install pantheon-shell); or Cinnamon (sudo add-apt-repository ppa:gwendal-lebihan-dev/cinnamon-stable, then sudo apt-get update, then sudo apt-get install cinnamon).
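
As referenced in tip 4, here's a minimal non-interactive sketch of tips 4 and 9 – it blindly appends, so check that the lines aren't already present in those files first:

# Tip 4: lower swappiness persistently without opening an editor
echo 'vm.swappiness=10' | sudo tee -a /etc/sysctl.conf
sudo sysctl -p

# Tip 9: keep /tmp in RAM (takes effect on the next reboot)
echo 'tmpfs /tmp tmpfs defaults,noexec,nosuid 0 0' | sudo tee -a /etc/fstab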

Download Tips to Speed Up Old Computers

The post Hak5 1216 – Android Hacking with the USB Rubber Ducky appeared first on Technolust since 2005.
