Please see our Winter 2012 schedule.
Class Location & Time: Every other Thursday, noon to 1 pm, Rm. 310 in the Bissell Building (140 St. George St.)
Organizers: Yuri Takhteyev, Quinn Dupont, and ginger coons
Summary: This is a highly informal brown-bag (bring-your-lunch) seminar looking at social science research on free / open source software (understood broadly). The seminars will focus on presentations of work in progress and active discussion by the audience. The seminar is open to all people interested in free / open source software. Our focus will be on academic research, but practitioners are welcome to join the discussion. The seminar will take place every other Thursday throughout the semester. If you have any questions or want to be added to our mailing list, please email yuri.takhteyev -at- utoronto.ca.
This presentation looks at the Open Colour Standard, a F/LOSS project devoted to the democratization of spot colour in commercial and hobbyist printing contexts. It will first look at the purpose and history of OCS before advancing some tentative findings on the value of reflexivity in the wider world of F/LOSS development, and on how hands-on experimentation both relates to and conflicts with the normally decentralized habits of F/LOSS.
ginger coons is a doctoral student at the Faculty of Information at the University of Toronto.
As the toolbox is to the carpenter, software engineering is to the modern programmer. But, unlike the carpenter, we now live in a post-Fordist world. Or so the story goes. By examining the history of a single, near-ubiquitous software production tool—the source code control/versioning system—this paper reveals old modes of production in new, more subtle configurations. F/LOSS is a powerful mode of production, but by adopting (and even actively developing) software engineering tools it runs the risk of losing its most powerful asset—democratic openness.
As computers grew in popularity in the late 1950s, and software became physically removed from computing hardware, the need for trained software programmers expanded, until in 1968 it was declared that an answer to the “software crisis” was urgently required. Simultaneously, agitation and revolt against hierarchical technocracy grew, putting computing technology front and centre in the battle for democratic ways of being. The technocratic reply was to launch the field of software engineering, and within a year the first source code control tools were developed. By 1972, Marc Rochkind had developed the Source Code Control System (SCCS) at Bell Labs, and the modern mode of software production was practically cemented. The effect of these tools was similar to that of factory architecture, conveyor belts, and time studies on mechanical production earlier in the 20th century.
Unlike earlier (deeply criticized) Marxist studies of contemporary production (Braverman, Burawoy, Kraft, etc.), this paper looks at production from a control perspective informed by Foucault's theory of subjectification. By contributing a politically potent history of an important software production tool, this paper grapples with the issue of the “American system” that, following Daniels, many have suggested is required to properly understand the history of technology.
Quinn Dupont is a doctoral student at the Faculty of Information at the University of Toronto.
My research is premised upon the idea that the creation of public policies and of the social web are both making activities, activities that can include citizens. This presentation will explore how the value of 'openness' is something citizens can experience through their interactions with government or through their design and use of socio-technical systems. The context of public participation online in Ontario will be introduced briefly, as will the social and technical aspects of openness experienced by citizen-designers and shared in interviews.
Karen Smith is a doctoral student at the Faculty of Information at the University of Toronto.
This is Open Access Week, and there will be a number of presentations centred on how journals can open access to peer-reviewed and published articles. However, much of the transformative power of open source software comes from the adage "release early, release often". Indeed, we expect a solid open source project not only to release a polished tarball of source code once the product is done, but to have an open development process, with every update available on GitHub or SourceForge, an open bug database, a public mailing list, etc. What would a similar paradigm look like for academic research?
This presentation will introduce early efforts at opening up science, including open notebook science and research wikis, as well as a collection of tools developed by the speaker to enable integrated sharing of notes and references from the research process. I will also discuss some of the challenges of semantic data sharing, and of sharing a loosely built framework with others.
This year we started a new project taking a descriptive look at practices around the preservation of old software. The longer-term goal of the project is to see whether and how the preservation of software can be supported by "open" approaches. In the shorter term, we are trying to understand the different efforts taking place within this space and how they relate to each other.
The Internet is popularly referred to as a "cloud," suggesting an ethereal, placeless entity where data is stored and routed almost magically according to one's wishes. However, this metaphor belies the firmly grounded, concrete aspects of Internet operation, which are located within stubbornly material built structures at quite specific locations and subject to powerful, but largely hidden, political and economic interests. In particular, the Internet backbone is the site of increasingly extensive and intensive surveillance, most notoriously in North America by the National Security Agency (NSA) through its warrantless wiretapping program, but also in Canada if the pending 'lawful access' legislation is passed. Such surveillance activities are deliberately shrouded in secrecy, leaving no detectable trace in the data trails they intercept.
This talk reports on an Internet exchange mapping project, known as IXmaps, and its approach to the challenge of rendering this hidden surveillance more publicly visible. To overcome the lack of "official" direct evidence, we combine heterogeneous information sources such as user-generated Internet traceroute data, geo-location techniques, and images of switching and data centres to create an interactive map of Internet traffic routing displayed via Google Earth. Through "crowdsourcing" we also recruit a geographically distributed group of contributors who provide vital information by initiating traceroute probes of the Internet backbone. The result is a novel and visually compelling view into Internet "cloud" operations, highlighting their physical, geographical and political characteristics.
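To make the crowdsourced methodology concrete, the following is a minimal illustrative sketch (not the IXmaps implementation; all names and the sample data are hypothetical) of how a contributor's standard Unix traceroute output might be parsed into a list of hops, each of which could then be geolocated and plotted:

```python
import re

# Matches a standard Unix traceroute hop line, e.g.:
#  2  core1.tor.example.net (203.0.113.5)  9.8 ms
# capturing the hop number, hostname, and IP address.
HOP_RE = re.compile(r"^\s*(\d+)\s+(\S+)\s+\(([\d.]+)\)")

def parse_traceroute(text):
    """Parse traceroute output into a list of hop dictionaries."""
    hops = []
    for line in text.splitlines():
        m = HOP_RE.match(line)
        if m:
            hops.append({
                "hop": int(m.group(1)),   # hop number along the route
                "host": m.group(2),       # reverse-DNS name of the router
                "ip": m.group(3),         # IP address, usable for geolocation
            })
    return hops

# Hypothetical sample of contributor-submitted traceroute output.
sample = """traceroute to example.org (93.184.216.34), 30 hops max
 1  router.local (192.168.1.1)  1.2 ms
 2  core1.tor.example.net (203.0.113.5)  9.8 ms
"""

hops = parse_traceroute(sample)
```

In a pipeline like the one described, each hop's IP address would next be fed to a geolocation step, and the resulting coordinates rendered as a route on the interactive map.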