BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//University of California\, Berkeley//UCB Events Calendar//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
BEGIN:VTIMEZONE
TZID:America/Los_Angeles
BEGIN:STANDARD
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
TZNAME:PST
DTSTART:19701101T020000
RRULE:FREQ=YEARLY;BYMONTH=11;BYDAY=1SU
END:STANDARD
BEGIN:DAYLIGHT
DTSTART:19700308T020000
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
TZNAME:PDT
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=2SU
END:DAYLIGHT
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20190220T192538Z
DTSTART;TZID=America/Los_Angeles:20190227T150000
DTEND;TZID=America/Los_Angeles:20190227T160000
TRANSP:OPAQUE
SUMMARY:Statistical Physics\, Markov Chains\, and Programmable Matter
UID:124066-ucb-events-calendar@berkeley.edu
ORGANIZER;CN="UC Berkeley Calendar Network":
LOCATION:1011 Evans Hall
DESCRIPTION:Sarah Cannon\, UC Berkeley\n\nI will discuss how tools from statistical physics used to analyze partition functions\, such as Peierls arguments and the cluster expansion\, can be used to solve seemingly unrelated distributed computing problems about programmable matter. Programmable matter is a material or substance that has the ability to change its features in a programmable\, distributed way\; examples are diverse and include robot swarms and smart materials. We study an abstraction of programmable matter where particles independently move on a lattice according to simple\, local algorithms. We want to design these algorithms so that the system has a desired collective behavior\, such as compression of the particles into a shape with small perimeter or separation of differently colored particles. In our stochastic approach\, we describe a desired collective behavior using an energy function\; design a Markov chain that uses local moves and converges to the Gibbs distribution for this energy function\; and then turn the Markov chain into an asynchronous distributed algorithm that each particle can execute independently. To prove our algorithms are correct\, we must show this Gibbs distribution has the desired properties with high probability. Our previous work on the compression problem used Peierls arguments to analyze the Gibbs distribution. More recent work on the separation problem necessitated the introduction of the cluster expansion to analyze the Gibbs distribution. The key feature of the cluster expansion we use is that we can separate partition functions into volume and surface terms that we can deal with separately. Joint work with Joshua J. Daymude\, Cem Gokmen\, Dana Randall\, and Andrea Richa.
URL:http://events.berkeley.edu/index.php/calendar/sn/pubaff.html?event_ID=124066&view=preview
SEQUENCE:0
CLASS:PUBLIC
CREATED:20190220T192538Z
LAST-MODIFIED:20190220T192538Z
X-MICROSOFT-CDO-BUSYSTATUS:BUSY
X-MICROSOFT-CDO-INSTTYPE:0
X-MICROSOFT-CDO-IMPORTANCE:1
X-MICROSOFT-CDO-OWNERAPPTID:-1
END:VEVENT
END:VCALENDAR