Monthly Archives: October 2013

Tears of Steel

I will use Tears of Steel for the FFmpeg work on the Dell R420.
The Blender Foundation shares the movie in several formats and sizes for download.

“Tears of Steel” — project Mango — is the Blender Foundation’s fourth short film project, whose sole purpose is to improve and validate Blender’s open source 3D VFX pipeline. The entire film was processed and created with free/open source software, from initial camera tracking to the final edit and grading. In October 2012, the film and all its assets were released under Creative Commons Attribution.

Uncompressed files
http://media.xiph.org/tearsofsteel/

Full Movie – New version (4k rendered)

  • HD 1920 pixels wide (~700MB, mov, 2.0)
  • 4K 3840 pixels wide (6.3 GB, mov)
  • 4K DCP 4096 x 2160 (14 GB, Digital Cinema Package)

Full Movie – First version (HD rendered)

  • HD 1080p (~560MB, mov, 2.0)
  • HD 1080p (~550MB, MKV, 2.0)
  • HD 1080p (~545MB, webm, 2.0)
  • HD 720p (~365MB, mov, 2.0)
  • HD 720p (~365MB, MKV, 2.0)

Subtitles (.srt)
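The HD or 4K .mov files above make good FFmpeg test inputs. As a minimal sketch (the local file names are my assumptions, not the official download names), a small Python helper can build the downscale/transcode command before handing it to ffmpeg:

```python
# Sketch: build an ffmpeg command that downscales the 4K .mov to 1080p H.264.
# File names below are placeholders; adjust to the files you actually downloaded.

def ffmpeg_downscale_cmd(src, dst, height=1080, crf=23):
    """Return the ffmpeg argv for a scale-to-height H.264 transcode."""
    return [
        "ffmpeg", "-i", src,
        "-vf", f"scale=-2:{height}",   # -2 keeps the width divisible by 2
        "-c:v", "libx264", "-crf", str(crf),
        "-c:a", "copy",                # leave the audio track untouched
        dst,
    ]

cmd = ffmpeg_downscale_cmd("tears_of_steel_4k.mov", "tos_1080p.mp4")
# To actually run it: subprocess.run(cmd, check=True)
```

Building the argv as a list (instead of one shell string) avoids quoting problems when the benchmark later loops over many input files.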


Some links
http://www.imdb.com/title/tt2285752/
http://www.youtube.com/watch?v=R6MlUcmOul8
http://www.youtube.com/watch?v=WwHux5QZfC8


National Security Agency has secretly infiltrated the main communications links of Yahoo and Google

Detail of an internal “NSA presentation slide” published by the Post. The sketch shows where the public Internet meets the private cloud maintained by Google, and points out — with a smiley face — that the data within the cloud is unencrypted (though Google is now working to encrypt such information).
(Credit: The Washington Post)

The National Security Agency has secretly infiltrated the main communications links connecting the worldwide data centers of Yahoo and Google, the Washington Post’s Barton Gellman and Ashkan Soltani report.

Read more: http://www.businessinsider.com/nsa-infiltrated-both-yahoo-and-google-2013-10#ixzz2jENRzVcf

Aspera and Zencoder

Zencoder is the performance leader in cloud-based video encoding, with the fastest and most widely used solution in the market. Fast encoding speeds, extreme scalability, and an easy API integration enable content providers to quickly deploy Internet video to consumers on virtually any Internet-connected device. Seamless integration with Aspera high-speed transfers ensures the fastest content ingest, significantly shortening end-to-end production time.

JOINT FEATURE HIGHLIGHTS

  • Aspera fasp embedded directly within Zencoder
  • Ensures maximum transfer speed of the media files and immediate start and parallel processing of encoding jobs

“We’re seeing upwards of 500 megabits per second of throughput up to the cloud, which, depending on some variables, is an incredible improvement above just standard FTP or HTTP.” A transfer of a one-hour DNxHD video file that could take 10 hours over FTP, now completes in less than 26 minutes using Aspera. On top of high-speed transfers, Aspera’s patented fasp transfer technology encrypts files in transit and at rest, assuring bulletproof security for business-critical content.
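Those numbers roughly check out. Assuming the one-hour file uses the DNxHD 220 profile at 220 Mbit/s (an assumption — the case study doesn’t state the bitrate), a quick back-of-the-envelope calculation:

```python
# Back-of-the-envelope check of the Aspera figures, assuming a DNxHD 220
# bitrate of 220 Mbit/s for the one-hour file (a common DNxHD profile).

DNXHD_MBPS = 220
file_mbit = DNXHD_MBPS * 3600          # one hour of footage, in megabits

fasp_minutes = file_mbit / 500 / 60    # at the quoted 500 Mbit/s
ftp_mbps = file_mbit / (10 * 3600)     # implied FTP throughput over 10 hours

print(f"file size   ~{file_mbit / 8 / 1000:.0f} GB")   # ~99 GB
print(f"fasp time   ~{fasp_minutes:.1f} min")          # ~26.4 min
print(f"FTP rate    ~{ftp_mbps:.0f} Mbit/s")           # ~22 Mbit/s
```

Under that assumption the arithmetic lands almost exactly on the quoted "less than 26 minutes", and implies the baseline FTP link was only managing about 22 Mbit/s.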

Source: http://asperasoft.com/fileadmin/media/Case_Studies/Zencoder_AsperaCS.pdf

Building an R Hadoop System

After reading documents and tutorials on MapReduce and Hadoop and playing with RHadoop for about two weeks, I have finally built my first R Hadoop system and successfully run some R examples on it. My experience and the steps to achieve that are presented at http://www.rdatamining.com/tutorials/rhadoop. Hopefully it will make it easier to try RHadoop for R users who are new to Hadoop. Note that I tried this on Mac only, and some steps might be different on Windows.

1. Install Hadoop
2. Run Hadoop
3. Install R
4. Install RHadoop
5. Run R jobs on Hadoop
6. What’s Next

Source: http://www.rdatamining.com/tutorials/rhadoop

Not yet tested, but I will in the next two or three weeks!

YouTube Wordcount MapReduce in R
http://www.youtube.com/watch?v=hSrW0Iwghtw
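The video does wordcount in R via RHadoop; the same map/reduce logic can be sketched in Python in the Hadoop Streaming style (function names here are mine, not from the tutorial):

```python
# A minimal wordcount in the Hadoop map/reduce style, as a Python sketch;
# the video above does the same job in R through RHadoop's rmr2 package.
from itertools import groupby
from operator import itemgetter

def mapper(lines):
    """Emit a (word, 1) pair for every word, like an rmr2 map function."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reducer(pairs):
    """Sum counts per word; Hadoop delivers pairs grouped/sorted by key."""
    for word, group in groupby(sorted(pairs), key=itemgetter(0)):
        yield word, sum(count for _, count in group)

counts = dict(reducer(mapper(["Hadoop and R", "R on Hadoop"])))
print(counts)  # {'and': 1, 'hadoop': 2, 'on': 1, 'r': 2}
```

In a real streaming job, the mapper and reducer would each read stdin and write tab-separated pairs to stdout, with Hadoop doing the sort/shuffle in between.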

The Open Compute Project

Facebook has been able to quantify energy-efficiency gains of 38% for new servers conforming to the specs of The Open Compute Project, said Matt Corddry, Director of Hardware Engineering at Facebook, speaking at the Open Server Summit in Santa Clara, California. Moreover, the new servers deliver 24% cost savings compared to generic OEM servers.

The Open Compute Project, which Facebook launched in April 2011, has resulted in vastly simplified Compute Servers, Storage JBODs and an innovative Open Rack System.

Source: http://www.convergedigest.com/2013/10/open-server-summit-open-compute.html


cdnjs.com, 542 js libraries served by CloudFlare CDN

http://cdnjs.com/ hosts a HUGE list of JS libraries – 542 of them – served by the CloudFlare CDN!
Hooray!

Everyone loves the Google CDN, right? Even Microsoft runs their own CDN.
The problem is, they only host the most popular libraries.
We host it all – JavaScript, CSS, SWF, images, etc!

(..)

At CloudFlare, we believe that open source, community-driven projects like CDNJS are the tools upon which the future of the internet will be built. CloudFlare is proud to provide the global CDN infrastructure that will help power that future.

Wikimedia Foundation Servers

There’s a huge collection of Wikimedia data center photos, taken by Victor Grigas in 2012.

https://commons.wikimedia.org/wiki/Category:Wikimedia_servers_in_2012

Servers are located at Equinix.

Equinix hosts most of the infrastructure for Digg and Salesforce.com, and provides colocation and interconnection services to a lengthy list of marquee customers, including Google, Yahoo, IBM, America Online, Akamai, Electronic Arts, GE and Merrill Lynch.
Source: http://www.datacenterknowledge.com/archives/2009/06/09/the-internets-busiest-intersection/
