iNNEXT SNES Retro USB Super Nintendo Controller Gamepad Joystick, USB PC Super Classic Controller Joypad Gamestick for Windows PC MAC Linux Android Raspberry Pi 3 Steam Sega Genesis Higan (2 Pack)


Reclaim your childhood with the iNNEXT USB SNES / NES Controller

Growing up doesn’t mean you have to abandon all facets of your childhood, and iNNEXT USB SNES / NES Controllers are perfect for telling your friends that age is just a number.

iNNEXT recently completed a new USB NES controller that functions almost exactly like the original NES gamepad we grew up with, now playable on a Windows PC.

Bring Back Your Youth Memories With iNNEXT USB SNES / NES Controller

Do you still remember the old-school SNES and NES video games? As an old-school gamer, you must have played the classics: Super Mario, Teenage Mutant Ninja Turtles, Contra (Operation C), Ninja Gaiden, The Legend of Zelda, Double Dragon, Downtown Nekketsu Monogatari, Tetris, and more.

This is a generic USB controller that uses a standard USB port. If your program or application accepts USB controller input, it works natively without drivers or patches: just plug and play. It works with any emulator you wish to download and use; search online for an SNES emulator, then do the same for ROMs.

Performs Exceptionally Well on Raspberry Pi

With this USB controller, your Raspberry Pi can act as a retro gaming machine.

High Compatibility

This controller works with: Windows 98/ME/2000/2003/XP/Vista/7/8/8.1/10; Mac OS X and later; Nintendo emulators; Raspberry Pi (including the Raspberry Pi 2 Model B); and RetroPie. You don’t need to install any special software.

Superior Quality

100% brand new! Super-sensitive buttons for precise control with an authentic retro feel. The cord is approx. 5 ft. long.

Reliable Warranty

100% money-back guarantee and a hassle-free 6-month replacement warranty with friendly, professional customer service. If you have any problem, please feel free to contact us; we will do our best to help you!

Package:

2 x iNNEXT SNES Retro USB Super Nintendo Controllers (packed in an iNNEXT box)
★ Generic USB controller using a standard USB port. If your program or application accepts USB controller input, it works natively without drivers or patches: just plug and play! The cord is approx. 5 ft. long. Super-sensitive buttons for precise control. This is a third-party controller, not an original SNES / NES controller, but it works phenomenally with Raspberry Pi game emulation and more.
★ Supported operating systems: Windows 98, ME, 2000, 2003, XP, Vista, 7, 8, 8.1, 10; Linux (Ubuntu, Linux Mint); Android (via a USB OTG cable); Mac OS X and later; retro-gaming operating systems: RetroPie, Recalbox, Happi Game Center, Lakka, ChameleonPi, Piplay
★ Supported devices: PC, notebook, or laptop computer; MacBook; Android smartphone (connect via a USB OTG cable); Raspberry Pi: Raspberry Pi 1 Model A, Model B, and Model B+, Raspberry Pi Zero, Raspberry Pi 2, Raspberry Pi 3 Model B
★ Supported platform: Steam (not all games are supported; works well with several simple Steam games)
★ Supported game emulators: NES and SNES emulators such as Snes9x, ZSNES, Higan, and BSNES (great for two-player action); handheld GBA emulators; Sega Genesis emulators; OpenEmu (OS X only); RetroArch (works well on Android); and more


Capitalist science: The solution to the replication crisis?


Bruce Knuteson pointed me to this article, which begins:

The solution to science’s replication crisis is a new ecosystem in which scientists sell what they learn from their research. In each pairwise transaction, the information seller makes (loses) money if he turns out to be correct (incorrect). Responsibility for the determination of correctness is delegated, with appropriate incentives, to the information purchaser. Each transaction is brokered by a central exchange, which holds money from the anonymous information buyer and anonymous information seller in escrow, and which enforces a set of incentives facilitating the transfer of useful, bluntly honest information from the seller to the buyer. This new ecosystem, capitalist science, directly addresses socialist science’s replication crisis by explicitly rewarding accuracy and penalizing inaccuracy.

The idea seems interesting to me, even though I don’t think it would quite work for my own research as my work tends to be interpretive and descriptive without many true/false claims. But it could perhaps work for others. Some effort is being made right now to set up prediction markets for scientific papers.
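The mechanism in the quoted passage is concrete enough to sketch. Here is a minimal Python illustration of the escrow settlement; the all-or-nothing payout rule and the amounts are my own simplifying assumptions, not parameters from the proposal:

```python
def settle_escrow(buyer_payment, seller_stake, judged_correct):
    """Settle a pairwise information sale held in escrow.

    The exchange holds the buyer's payment and the seller's stake until
    the buyer judges the information's correctness. Returns a
    (buyer_payout, seller_payout) pair.
    """
    pot = buyer_payment + seller_stake
    if judged_correct:
        # The seller was right: they collect the payment plus their stake.
        return (0.0, pot)
    # The seller was wrong: the buyer is refunded and keeps the stake.
    return (pot, 0.0)
```

Under this rule the seller profits only by being correct, which is the incentive structure the excerpt describes; a real exchange would also need to handle anonymity and the buyer's incentive to judge honestly.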

Knuteson replied:

Prediction markets have a few features that led me to make different design decisions. Two of note:
– Prices on prediction markets are public. The people I have spoken with in industry seem more willing to pay for information if the information they receive is not automatically made public.
– Prediction markets generally deal with true/false claims. People like being able to ask a broader set of questions.

A bit later, Knuteson wrote:

I read your post “Authority figures in psychology spread more happy talk, still don’t get the point . . .”

You may find this Physics World article interesting: Figuring out a handshake.

I fully agree with you that not all broken eggs can be made into omelets.

Also relevant is this paper where Eric Loken and I consider the idea of peer review as an attempted quality control system, and we discuss proposals such as prediction markets for improving scientific communication.

The post Capitalist science: The solution to the replication crisis? appeared first on Statistical Modeling, Causal Inference, and Social Science.


How American Airlines is Innovating Customer Travel with IBM Cloud


As the world’s largest airline, American Airlines is committed to providing an exceptional experience for our customers and team members based on industry expertise and innovation. That means we must continuously modernize and transform our business by developing the right tools and technology along the way.

Technology was the foundation of American’s relationship with IBM when, in the 1950s, we worked together to develop the industry’s first electronic reservation and ticketing system. Today, technology touches every part of our business. Every customer logging on, checking in at our kiosks, or managing their day-of travel on our mobile app is supported by an incredible amount of technology that gives our customers and team members alike the services they need, exactly when and where they need them.

We began seriously considering cloud computing a couple of years ago, using key guiding principles to help us choose our path. First, American is committed to open standards. We chose Cloud Foundry because it gives us the ability to move our workloads around based on our needs, without the extra burden of being locked into one provider or another. Second, security is a non-negotiable requirement in our cloud provider selection. Our partner needs to have not only robust security offerings but also deep experience in providing secure computing to all industries, especially government and federal institutions. Third, American’s first choice is to leverage the public cloud for the scalability, cost, and innovation benefits it brings. Then, if public cloud isn’t a fit for a particular workload, we will look at other environments as necessary.

By leveraging the efficiencies of the cloud, American could be nimble enough to fully realize the benefits of the platform. We saw potential for huge, repeatable returns in increased productivity, agility, and responsiveness when the entire ecosystem was addressed, all of which are necessary for driving timely innovation. It also became clear that in order to achieve those returns, American needed a partner that took a broader approach to cloud innovation than just the platform.

After a very extensive search, American chose IBM as our strategic cloud partner, and last year we announced our intent to migrate some of our most critical enterprise applications to the cloud. IBM brings solid, scalable, secure public cloud infrastructure services, in addition to a robust application services marketplace. They are committed to open standards, and by selecting Cloud Foundry as their foundation, IBM gives us the flexibility we need to move our workloads at will across our environments. IBM continues to make significant investments in its cloud architecture, and its next-generation cloud computing roadmap is very exciting and validates that we made the right cloud decision.

We also chose IBM as our strategic cloud partner for their long-standing history of helping enterprises transform their organizations to better leverage technology for rapid innovation. IBM is helping American address our organization and processes so that we can take full advantage of the productivity, speed, and agility benefits of cloud computing. They are bringing forward content experts, tools, and assets that our teams can use to quickly adopt and adapt to new, exciting ways of working together in the cloud. This includes leveraging IBM’s Garage Method, Design Thinking, DevOps, continuous delivery, agile methodology, squad/tribe models based on the Spotify model, pair programming, and lean development.

American and IBM are partnering to migrate several of our critical enterprise applications, including our customer mobile app and our kiosk app, from three distinct applications today into a single set of microservices on IBM Cloud’s Platform as a Service.

Based on a partnership of innovation that started over half a century ago, our journey to the cloud with IBM is another significant milestone in our long-standing relationship. American Airlines and IBM will work together, again, to bring solutions that delight our customers and bring important innovations to the airline industry.

The post How American Airlines is Innovating Customer Travel with IBM Cloud appeared first on THINK Blog.


Significant Digits For Tuesday, June 27, 2017


You’re reading Significant Digits, a daily digest of the numbers tucked inside the news.

23 percent

Percentage of traditional retail jobs in the U.S. located in rural and small metropolitan areas. On the other hand, only 13 percent of e-commerce jobs are in rural or small metros. As the retail business moves increasingly to urban centers and online, the bottom may be falling out for the industry in rural areas already hit by manufacturing closures. [The New York Times]

36.9 kilograms

Amount of the potent narcotic fentanyl seized coming into the U.S. via international and express mail carriers in the 2016 fiscal year. In 2013, the total was 1.08 kilograms. [The Wall Street Journal]

62 percent

Percentage of Americans who support allowing gays and lesbians to marry legally, with 32 percent opposed. In 2011, the nation was split on the issue 46 percent to 44 percent. In 2007, 37 percent supported same-sex marriage and 54 percent opposed. [Pew Research Center]

93 percent

GOP Sen. Dean Heller of Nevada could be the deciding vote on the Senate bill to roll back the Affordable Care Act. He’s voted in line with President Trump’s stated position on legislation 93 percent of the time thus far. [FiveThirtyEight]

$772 billion

The nonpartisan Congressional Budget Office released its analysis of the GOP’s Senate health care bill on Monday. The Better Care Reconciliation Act of 2017 would cut $772 billion from Medicaid over the next decade, and 15 million fewer people would be on the program’s rolls. Overall, 28 million Americans are projected to be without insurance in 2026 under current law; the CBO says it would be 49 million people under the Senate bill. [FiveThirtyEight]

1.2 trillion photos

Estimated number of digital photographs that will be taken this year worldwide. Thanks to the huge growth in mobile adoption, 85 percent of them will be taken with a phone. A mere 400 billion digital photos were taken in 2011, and just half were taken with a phone. [Recode]

If you see a significant digit in the wild, send it to @WaltHickey.


Chicago Fed “Index Points to Slower Economic Growth in May”


From the Chicago Fed: Chicago Fed National Activity Index Points to Slower Economic Growth in May

Led by declines in production-related indicators, the Chicago Fed National Activity Index (CFNAI) moved down to –0.26 in May from +0.57 in April. Three of the four broad categories of indicators that make up the index decreased from April, and three of the four categories made negative contributions to the index in May. The index’s three-month moving average, CFNAI-MA3, declined to +0.04 in May from +0.21 in April.
emphasis added

This graph shows the Chicago Fed National Activity Index (three month moving average) since 1967.

[Graph: Chicago Fed National Activity Index. Click on graph for larger image.]

This suggests economic activity was close to the historical trend in May (using the three-month average).

According to the Chicago Fed:

The index is a weighted average of 85 indicators of growth in national economic activity drawn from four broad categories of data: 1) production and income; 2) employment, unemployment, and hours; 3) personal consumption and housing; and 4) sales, orders, and inventories.

A zero value for the monthly index has been associated with the national economy expanding at its historical trend (average) rate of growth; negative values with below-average growth (in standard deviation units); and positive values with above-average growth.
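To make the smoothing concrete, here is a quick Python sketch of the three-month moving average (CFNAI-MA3). April (+0.57) and May (-0.26) are the monthly values reported above; the February and March placeholders are back-solved so the averages reproduce the reported +0.21 and +0.04:

```python
def cfnai_ma3(monthly):
    """Three-month moving average of a monthly index series."""
    return [round(sum(monthly[i - 2:i + 1]) / 3, 2)
            for i in range(2, len(monthly))]

# Feb and Mar are illustrative placeholders; Apr and May are the
# reported monthly index values.
index = [0.25, -0.19, 0.57, -0.26]
print(cfnai_ma3(index))  # [0.21, 0.04] -> the April and May CFNAI-MA3
```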


Recent Developments in Cointegration


(This article was originally published at Econometrics Beat: Dave Giles’ Blog, and syndicated at StatsBlogs.)

Recently, I posted about a special issue of the journal, Econometrics, devoted to “Unit Roots and Structural Breaks”.
Another recent special issue of that journal will be of equal interest to readers of this blog. Katerina Juselius has guest-edited an issue titled “Recent Developments in Cointegration”. The papers published so far in this issue are, of course, open-access. Check them out!

© 2017, David E. Giles


The post Recent Developments in Cointegration appeared first on All About Statistics.


bigrquery 0.4.0


(This article was first published on RStudio Blog, and kindly contributed to R-bloggers)

I’m pleased to announce that bigrquery 0.4.0 is now on CRAN. bigrquery makes it possible to talk to Google’s BigQuery cloud database. It provides both DBI and dplyr backends so you can interact with BigQuery using either low-level SQL or high-level dplyr verbs.

Install the latest version of bigrquery with:

install.packages("bigrquery")
Basic usage

Connect to a bigquery database using DBI:


con <- DBI::dbConnect(dbi_driver(),
  project = "publicdata",
  dataset = "samples",
  billing = "887175176791"
)
DBI::dbListTables(con)
#> [1] "github_nested"   "github_timeline" "gsod"            "natality"       
#> [5] "shakespeare"     "trigrams"        "wikipedia"

(You’ll be prompted to authenticate interactively, or you can use a service token with set_service_token().)

Then you can either submit your own SQL queries or use dplyr to write them for you:

shakespeare <- con %>% tbl("shakespeare")
shakespeare %>%
  group_by(word) %>%
  summarise(n = sum(word_count))
#> 0 bytes processed
#> # Source:   lazy query [?? x 2]
#> # Database: BigQueryConnection
#>            word     n
#>  1   profession    20
#>  2       augury     2
#>  3 undertakings     3
#>  4      surmise     8
#>  5     religion    14
#>  6     advanced    16
#>  7     Wormwood     1
#>  8    parchment     8
#>  9      villany    49
#> 10         digs     3
#> # ... with more rows

New features

  • dplyr support has been updated to require dplyr 0.7.0 and to use dbplyr. This means that you can now work more naturally with DBI connections directly. dplyr now translates to modern BigQuery SQL, which supports a broader set of translations. Along the way I also fixed a variety of SQL generation bugs.
  • New functions: insert_extract_job() makes it possible to extract data and save it in Google Cloud Storage, and insert_table() allows you to insert empty tables into a dataset.
  • All POST requests (inserts, updates, copies, and query_exec) now take .... This allows you to add arbitrary additional data to the request body, making it possible to use parts of the BigQuery API that are otherwise not exposed. snake_case argument names are automatically converted to camelCase, so you can stick consistently to snake case in your R code.
  • Full support for DATE, TIME, and DATETIME types (#128).
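The snake_case-to-camelCase conversion mentioned above is simple mechanical rewriting. A sketch of the idea (shown in Python for brevity; this is not bigrquery's actual implementation):

```python
def snake_to_camel(name):
    """Convert a snake_case argument name to camelCase."""
    head, *rest = name.split("_")
    return head + "".join(part.capitalize() for part in rest)

print(snake_to_camel("write_disposition"))  # writeDisposition
```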

There were a variety of bug fixes and other minor improvements: see the release notes for full details.


bigrquery is a community effort: a big thanks goes to Christofer Bäcklin, Jarod G.R. Meng, and Akhmed Umyarov for their pull requests. Thank you all for your contributions!


