09 March 2018

Not so Flickrless Friday

Another Flickr Friday, another bust in locating a suitable image. What to do this time? A few months ago I featured Flickrless Friday (December 2017). Now I have to find a different angle. I know! Let's have a quiz. Question: What do the four Flickr 'chess' photos shown below all have in common?


Photo top left: Beer Can House © Flickr user Thomas Hawk under Creative Commons.

Answer: They all have a white tag that says 'chess'. As I explained for one of the images in the 'Flickrless Friday' post,

The only association with chess is via a white tag assigned by Flickr, i.e. if the image looks like a group of chess pieces, let's assign it to 'chess'.

Those white tags are now infamous. The headline of one news article read, 'Flickr’s auto-tagging feature goes awry, accidentally tags black people as apes' (independent.co.uk; May 2015), along with the subtitle 'The site’s tool was built to help people easily identify features of pictures -- but has run into problems as it learns'. No kidding! The article went on to explain,

Though the racist implications were obvious, it has also identified a white women [sic] with the same tag.

If I had been in charge of that project, I would have pulled it immediately and insisted on zero classification errors when identifying people. Imagine the potential for lawsuits. The same article said later,

Flickr launched the features a couple of weeks ago. The team behind it explained to the Independent just before the launch that it uses "convolutional neural networks", or computers that act like human brains, to identify the photos.

A convolutional neural network (CNN) is also at the heart of the technology behind Deepmind's AlphaZero. When people talk about artificial intelligence (AI), they are often referring to neural-network techniques like the CNN.
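For readers unfamiliar with the term, the core operation of a CNN is 'convolution': a small grid of weights (a filter, or kernel) slides across a larger grid of values, producing a new grid that highlights local patterns. Here is a minimal illustrative sketch of that single operation; it has nothing to do with Flickr's or Deepmind's actual implementations, which stack many such layers with learned weights.

```python
# Minimal sketch of a 2D convolution, the building block of a CNN.
# The kernel slides over the image; each output cell is the sum of
# element-wise products between the kernel and the patch under it.

def convolve2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(kh) for dj in range(kw))
            row.append(s)
        out.append(row)
    return out

# A tiny vertical-edge detector applied to a toy pattern: the output
# is large exactly where the dark/light boundary sits.
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[-1, 1],
          [-1, 1]]
print(convolve2d(image, kernel))  # -> [[0, 2, 0], [0, 2, 0]]
```

In a real network the kernel values are learned from data rather than hand-picked, and thousands of such filters are applied in sequence.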

Here are links (photos left to right, top to bottom) to the Flickr pages associated with the four photos I selected. I repeated the first link to be consistent.

If I ever run into a Flickrless Friday again, I'll have to think up something really special.

***

Later: While working on the next Flickr Friday post, The Noyon Chess Pieces, I looked at a number of photos returned by Flickr search but which had no visible mention of chess. Then I noticed that the white 'chess' tags had disappeared from the photos used in this 'Not so Flickrless Friday' post. They are apparently still present and being used for search.

08 March 2018

Stockfish in a Straitjacket?

It was one of those coincidences you can never plan. Near the end of last year Houdini won TCEC Season 10 at the same time that AlphaZero appeared on the scene. I covered both of those significant computer chess events in a single post, Houdini, Komodo, Stockfish, and AlphaZero (December 2017). The first three names are the top three chess engines in the world, of roughly equal strength, but AlphaZero had apparently crushed one of the trio in a match. In my post I wrote,

We can quibble about whether the AlphaZero - Stockfish match was indeed a fair fight -- 1 GB hash size is a severe restriction -- but the final score of +28-0=72 for AlphaZero was more than convincing to all but the most vehement skeptics.

I was reminded of those words while writing my most recent post, TCEC Season 11 in Full Swing. One of the sources I consulted, without referencing it in the post, was TCEC 11: Premier Division starts (chessbase.com; February 2018). The Chessbase site is well known and well respected for its expertise in computer chess and always attracts comments from informed readers. This particular article launched a discussion on why AlphaZero wasn't participating in TCEC Season 11 and whether the AlphaZero - Stockfish match had been too heavily rigged in AlphaZero's favor. The discussion mentioned four factors that could have hurt Stockfish's performance:-

  • Restricted hash size
  • Fast time control
  • No opening book
  • No endgame tablebases

I knew that the first two points were an issue, but wasn't certain if the last two were true. I went back to the Deepmind paper that had announced AlphaZero to the world (titled 'Mastering Chess and Shogi by Self-Play with a General Reinforcement Learning Algorithm') and re-read the relevant section:-

Evaluation • To evaluate performance in chess, we used Stockfish version 8 (official Linux release) as a baseline program, using 64 CPU threads and a hash size of 1GB. [...] The Elo rating of the baseline players was anchored to publicly available values. We also measured the head-to-head performance of AlphaZero against each baseline player. Settings were chosen to correspond with computer chess tournament conditions: each player was allowed 1 minute per move, resignation was enabled for all players (-900 centipawns for 10 consecutive moves for Stockfish and Elmo, 5% winrate for AlphaZero). Pondering was disabled for all players.

Since 'publicly available [Elo] values' depend both on configuring engines properly and on a level playing field, I started to have serious concerns that this controversy was more than a quibble. What did the Stockfish developers think about the match? On Stockfish's Fishcooking forum, in a long thread titled Open letter to Google DeepMind (December 2017), the opening message said,

AlphaZero won the 100 game match against Stockfish very impressively by a total score of 28 wins and 72 draws and 0 [losses]. This translates to an Elo difference of 100. However the details of the match described in your paper show that this match might have been much closer and more interesting had it not been for some IMO rather unfair conditions.
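The Elo figure quoted above follows from the standard logistic expected-score model. A quick check of the arithmetic (my own sketch, not taken from the Fishcooking thread):

```python
import math

# Convert a match score into an approximate Elo difference using the
# standard logistic model: expected score = 1 / (1 + 10^(-diff/400)).
wins, draws, losses = 28, 72, 0
score = (wins + 0.5 * draws) / (wins + draws + losses)  # 0.64

# Invert the expected-score formula to recover the rating difference.
elo_diff = 400 * math.log10(score / (1 - score))
print(round(elo_diff))  # about 100, matching the open letter
```

A 64% score really does correspond to roughly a 100-point rating gap, so the open letter's conversion is sound; the dispute is about the conditions that produced the score, not the arithmetic.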

That first post and the subsequent discussion repeated the four complaints from the Chessbase comments listed above, and added,

In the match version 8 of Stockfish was used which is now over a year old. The latest version of Stockfish is over 40 Elo stronger in fast self play.

That makes five significant objections to the conduct of the match. Later in the same Fishcooking thread, TCEC insider Nelson Hernandez wrote,

This "match" was like a boxing match where one fighter had no seconds in his corner, the referee and judges were picked by his opponent, there was no audience to validate what happened in the ring as it happened, and the post-match story was written by the opponent's hirelings. It may well be that Alpha Zero is indeed better than the latest version of Stockfish in fair test conditions. But it is almost criminal to announce very biased test results such as these, thereby rubbishing the work of hundreds of people, in order to gain some PR benefit. What the computer chess community expects is fairness and decency.

The clincher to the above discussion is that three months have passed since Deepmind's bombshell announcement, which made available only ten games from the match. None of the other 90 games have been released for dissection by the experts. AlphaZero might be a better chess engine than Stockfish, but it might also be much worse. If we can't have a match where the Stockfish developers configure their creation for its full strength, let's have the other games from the first match.

06 March 2018

TCEC Season 11 in Full Swing

Two world class computer championships in a twelve month period? Less than six months ago on this blog we had TCEC Season 10 Kickoff (September 2017), where I wrote,

Fans of engine-to-engine play -- and who isn't? -- know that the TCEC (Top Chess Engine Championship) is the toughest tournament of them all. Many consider it to be the real World Championship of chess engines. The TCEC takes place on Chessdom.com, and over the past month the site has announced plans for Season 10.

I could have used that same paragraph for this current post by changing 'Season 10' to 'Season 11'. I covered the end of TCEC Season 10 in Houdini, Komodo, Stockfish, and AlphaZero (December 2017), where I noted, 'With the score at +14-9=73 after 96 games, Houdini was declared the winner [over Komodo]'. While that season was itself in full swing, Chessdom issued a TCEC Season 11 press release (November 2017):-

Starting with its 11th season in early 2018, TCEC will adopt a league format consisting of four divisions of eight chess engines. The five divisions will be called the Premier, First, Second, Third, and Fourth Divisions. Each division will conduct a tournament which will lead to the top two engines in the Premier Division facing off in a 100-game Superfinal for the TCEC seasonal championship.

The league’s mechanics are straightforward. Divisional tournaments will be conducted in sequence from the lowest (Third Division) to the highest (Premier Division). At the end of the Third, Second and First Division tournaments the top two finishers will be promoted to the next-higher division. At the end of the Second, First and Premier Division tournaments the bottom two finishers will be relegated to the next-lower division.

That preliminary announcement was further embellished with TCEC Season 11 - information and participants (December 2017):-

TCEC Season 11 will start this January 3rd. It will involve 30 of the strongest computer chess software programs in the world. One more time the engines will be provided with a high quality hardware -- a 44 cores server -- and will compete in equal conditions to crown the strongest one in the Top Chess Engine Championship.

The last of the four divisions finished a month and a half later with Andscacs wins TCEC Division 1 (February 2018; includes links to results for lower divisions):-

With this division gold medal Andscacs earns the right to participate in the race for the TCEC title in the Premier Division, an event which will be the strongest computer chess championship in history.

This was immediately followed by the computer version of a candidates tournament: TCEC Premier Division – the strongest computer chess event in history (February 2018):-

After four divisions of exciting qualification battles, we are at the doorstep of the highest category of the Top Chess Engine Championship. The eight best chess software programs, that any professional player or aficionado can use on a home computer, are going to meet in a direct battle to determine the best of the best in the field. The eight participants include the defending champion Houdini, the vice champion Komodo, the top open source program Stockfish, as well as the challengers Fire, Ginkgo, Chiron, Andscacs, and Fizbo.

The November 2017 press release provided details about the format of the Premier Division:-

Top two Premier Division finishers advance to Superfinal competition; bottom two finishers relegated to First Division. • 6x double round-robin (engines play each other 12 times); 84 games each engine.
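The schedule arithmetic in that quote checks out; with eight engines, a '6x double round-robin' gives each engine 84 games. A quick sanity check (my own sketch):

```python
# Sanity check of the Premier Division schedule arithmetic.
# '6x double round-robin' means each pair of engines meets 12 times.
engines = 8
meetings_per_pair = 12  # 6 round-robin cycles x 2 games per pairing

games_per_engine = (engines - 1) * meetings_per_pair
total_games = engines * games_per_engine // 2  # each game counted once

print(games_per_engine)  # 84, as the press release states
print(total_games)       # 336 games in the division overall
```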

As I write this post, the tournament is at the start of its second half, the seventh round. The action goes on 24/7 at TCEC - Live Computer Chess Broadcast.

05 March 2018

Interview Videos : Mamedyarov

Next up in our series of interviews with the eight players taking part in the 2018 Candidates Tournament, Berlin (starts this coming weekend!), after Interview Videos : Kramnik, is Azerbaijani grandmaster Shakhriyar [Shahriyar] Mamedyarov. He qualified for the Candidates event by finishing first in the 2017 Grand Prix.


Tata Steel Chess - Interview - Shakhriyar Mamedyarov - Round 13 (4:31) • 'Published on Jan 28, 2018'

The interview was conducted after the last round of the recent supertournament in Wijk aan Zee. Mamedyarov finished tied for 3rd-4th with Kramnik, after Giri and Carlsen.

Q: Was this your last tournament before the Candidates? A: No, I will play in the Tal Memorial before the Candidates -- maybe it's also not the last. I just want to play chess and not only to sit home and prepare for the Candidates.

GM Mamedyarov is currently ranked world no.2 on the March 2018 FIDE rating list. For an overview of his record against the other candidates, see Berlin Candidates - Kickoff (February 2018) on my World Chess Championship blog.

04 March 2018

Chess in Education and Health

FIDE has just issued a new booklet, Chess - A Tool for Education and Health (fide.com).

This new 48 page booklet is now freely available to download as a PDF from cis.fide.com. In this new edition, we hope to disseminate throughout the worldwide chess community the benefits of education, health, as well as the use of chess in different social and therapeutic areas. We hope it serves as a letter of introduction, not only for amateurs, instructors and teachers, but also for the entire community that wants to know the work that is being carried out with chess as an educational and socialization tool.

The booklet overlaps many of the topics that we've seen on this blog in this ongoing series about The Sociology of Chess (November 2016). Here are its first two pages, the cover and the table of contents:-

If that table of contents is too small to read, here's the list in a more readable format:-

2 Morals of Chess
3 Chess in Bloom
4 Thinking Skills
5 Educational Cutlery
6 Critical & Creative Thinking - Chess in the Educational Process
7 Chess as a Teaching Tool
8 Educational Benefits of Chess
10 Psychomotor Skills
11 STEM Skills
12 Cognitive Abilities
13 Life Skills & Counselling
14 Ethical Sense
15 ADHD & Autism
16 Social Benefits & Minorities
18 Health Benefits
19 Beating Cognitive Decline
20 Smart Girl Uganda
21 Queen of Katwe
22 Prisons - Chess That Brings Freedom
26 Alzheimer's - Checkmating Dementia
28 Teaching Programs - 4-6 Early Years Skills
30 Teaching Programs - 7-11 Planet Chess and others
32 Teacher Training
33 FIDE School Instructor title
34 FIDE School Chess Leader diploma
35 Support for Teachers
36 European Parliament
38 European Union – Erasmus+
39 European Chess Union
40 CiS Around the World
42 Chess & Education Conferences
44 Research
46 Bibliography

Several of these topics are controversial, for example 'Alzheimer's - Checkmating Dementia', a topic I last covered in More on Chess and Alzheimer's (July 2016). The FIDE booklet says,

Research among those over the age of 60 strongly suggests that chess is valuable in combating Alzheimer's.

The phrase 'strongly suggests' is less provocative than the usual phrase 'studies show', and I imagine that FIDE is making an effort to avoid adding fuel to the controversy. Other topics in the booklet show similar circumspection. For more about these topics on this blog, follow the links for 'Chess in School' Summarized (October 2016), and FIDE's Social Commissions 2017 (November 2017).

02 March 2018

Magnus Streams on Youtube

Nearly four hours of World Champion Magnus Carlsen talking about his own games in progress. What's not to like?


Special guest Magnus Carlsen streaming his PRO Chess League games (3:56:10) • 'Published on Feb 25, 2018'

The clip is from the Jon Ludvig Hammer channel; GM Hammer is an official friend of Magnus. The hundreds of comments make up for the lack of a description on the clip. Take this comment, for example:-

Wow, this is a revolution in the history of chess. Being able to watch the World Champion and leading chess player over the last 10 years, sit in a comfortable environment and play and comment on both his own and others' games, is a huge source for insight. Thank you so much for sharing this Jon Ludvig, I think this video will live a long life on Youtube.

The first game on the video can be found at Magnus Carlsen vs Roman Yanchenko, Pro Chess League 2018 (chessgames.com; GM Carlsen opens 1.h3). The other games can be found by following that link.

01 March 2018

March 1968 'On the Cover'

Every month, the 'On the Cover' series goes back 50 years for a glimpse at what the top American chess magazines were reporting. This month marks four years since the first in the series, March 1964 'On the Cover'.


Left: 'Jerry Spann, Oklahoma City' (Badge)
Right: 'Medieval Manikins'

Chess Life

'Good Man Gone' by Ed Edmondson • There was a man, a man named Jerry. For that's how he was known to all of us -- simply, warmly, "Jerry" to his thousands of friends wherever chess is played. Jerry fought with characteristic verve and courage throughout the final months of an encounter with the toughest opponent of them all, succumbing to the last check early this year.

In the February 1968 'On the Cover', the cover of Chess Review informed us that Spann was a former USCF president. The March 1968 Chess Life included a second article, 'Legacy from Jerry' by Fred Cramer, Past President, USCF.

Chess Review

Mark Freeman reports to us: The recent "Artists as Craftsmen" exhibition at the East Side Gallery displayed the work of the gallery artists in their lighter moods. Whimsy, humor and originality marked many of the truly unique objects in this Annual.

A feature of the exhibition was a series of "Knights" by William D. Gorman. Using beach pebbles for heads and wood carving for bodies, he created beautifully crafted chess desk pieces, each of individualistic character. [...] CHESS REVIEW regrets not being able to present the set in its proper colors on its cover but is using green for St. Patrick's day.

The Merriam-Webster dictionary informs us that 'manikin' is a variant of 'mannequin' and 'mannikin', is 'dated, usually disparaging', means 'a little man', and has 'Popularity: Bottom 40% of words'. I don't recall the word ever being used to describe a chess set.