How can non-trivial graphical user interfaces be designed in Plan 9 without losing their minimalist style? Different toolkits are discussed, and a tabbed toolbar is proposed as a way to add functionality without cluttering the interface or resorting to pop-up dialog boxes. A hypothetical port of the LyX GUI is used as an example.
Objective: To investigate the incidence of and risk factors for small-bowel obstruction (SBO) after certain surgical procedures. Design: A population-based retrospective register study. Background: Small-bowel obstruction causes considerable patient suffering. Risk factors for SBO have been identified, but the effect of surgical technique (open vs laparoscopic) on the incidence of SBO has not been fully elucidated. Setting and Patients: The Inpatient Register held by the Swedish National Board of Health and Welfare was used. The hospital discharge diagnoses and registered surgical procedures identified data for cholecystectomy, hysterectomy, salpingo-oophorectomy, bowel resection, anterior resection, abdominoperineal resection, rectopexy, appendectomy, and bariatric surgery performed from January 1, 2002, through December 31, 2004. Data on demographic characteristics, comorbidity, previous abdominal surgery, and death were collected. Main Outcome Measures: Episodes of hospital stay and surgery for SBO within 5 years after the index surgery. Results: A total of 108 141 patients were included. The incidence of SBO ranged from 0.4% to 13.9%. Multivariate analysis revealed age, previous surgery, comorbidity, and surgical technique to be risk factors for SBO. Laparoscopy exceeded other risk factors in reducing the risk of SBO for most of the surgical procedures.
Conclusions: Open surgery seems to increase the risk of SBO at least 4 times compared with laparoscopy for most of the abdominal surgical procedures studied. Other factors such as age, previous abdominal surgery, and comorbidity are also of importance.
We have now started testing a telehealth system for stroke rehabilitation in a rural area in Sweden (the NU-Hospital Group area). For collection of assessments and audiovisual communication, the telehealth system has bidirectional contact with the home-based units. To date, three stroke subjects have participated; they were instructed to play 3D computer games with the hemiplegic upper extremity. The intervention led to clinical changes for all subjects. The analysis of the audiovisual communication revealed that both the stroke subjects and the therapists were not yet effective in regulating their turn-taking process. The data suggest the feasibility of a distance-based approach using 3D virtual environments for upper extremity rehabilitation after stroke.
This paper describes an experiment in using a variant of Ant Colony Optimization (ACO) for the Traveling Salesman Problem (TSP), together with an automatic fitness function in the evolutionary algorithm, to create jazz improvisation solos. It is a sub-project of our overall EJI (Evolutionary Jazz Improvisation) project, in which we try to explore the inner nature of jazz music and model jazz improvisation solos and jazz harmony in the computer by means of evolutionary algorithms, swarm theory, chaos theory, neural networks, memetics, and other kinds of heuristics.
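For readers unfamiliar with the ACO mechanics referenced above, the following is a minimal sketch of Ant Colony Optimization applied to a small TSP instance, written in Python. It only illustrates the underlying search heuristic: the mapping of tour "cities" to musical material (e.g. phrase fragments) and the distance matrix are assumptions for illustration, not the EJI project's actual representation or fitness function.

```python
# Minimal Ant Colony Optimization for a small TSP-style tour problem.
# Illustrative only: interpreting the nodes as musical phrase fragments
# is an assumption, not the EJI project's actual model.
import random
import math

def aco_tour(dist, n_ants=20, n_iters=100, alpha=1.0, beta=2.0,
             evaporation=0.5, q=1.0, seed=0):
    """Return (best_tour, best_length) for a symmetric distance matrix."""
    rng = random.Random(seed)
    n = len(dist)
    pheromone = [[1.0] * n for _ in range(n)]
    best_tour, best_len = None, math.inf

    for _ in range(n_iters):
        tours = []
        for _ in range(n_ants):
            start = rng.randrange(n)
            tour, visited = [start], {start}
            while len(tour) < n:
                i = tour[-1]
                candidates = [j for j in range(n) if j not in visited]
                weights = [pheromone[i][j] ** alpha * (1.0 / dist[i][j]) ** beta
                           for j in candidates]
                j = rng.choices(candidates, weights=weights, k=1)[0]
                tour.append(j)
                visited.add(j)
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length

        # Evaporate, then deposit pheromone proportional to tour quality.
        for i in range(n):
            for j in range(n):
                pheromone[i][j] *= (1.0 - evaporation)
        for tour, length in tours:
            for k in range(n):
                a, b = tour[k], tour[(k + 1) % n]
                pheromone[a][b] += q / length
                pheromone[b][a] += q / length
    return best_tour, best_len

if __name__ == "__main__":
    # Hypothetical "distance" matrix, e.g. melodic distance between 5 phrase fragments.
    d = [[0, 2, 9, 10, 7],
         [2, 0, 6, 4, 3],
         [9, 6, 0, 8, 5],
         [10, 4, 8, 0, 6],
         [7, 3, 5, 6, 0]]
    print(aco_tour(d))
```

In an improvisation setting one would replace the geometric distances with a musical cost (and the tour length with a fitness function), but the pheromone update loop stays the same.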
This study presents results of a survey about social network website (SNW) usage that was administered to university students in China, Egypt, France, Israel, India, Korea, Macao, Sweden, Thailand, Turkey, and the United States. The offline and online social ties of SNW users were examined by nationality, levels of individualism-collectivism (I-C), gender, SNW usage, age, and access location. Contrary to the existing literature, we found no difference in the number of offline friends between individualist and collectivist nations. Similarly, there was no difference in the number of online social ties between individualist and collectivist nations. However, members of collectivist nations had significantly more online social ties never met in person. Heavy SNW users in individualist nations maintained significantly higher numbers of offline social ties; however, heavy SNW users in collectivist nations did not have higher numbers of offline social ties. Related implications and recommendations are provided.
The County Administrative Board, a governmental authority in the West Region of Sweden, has, in collaboration with the local counties, asked us for advice on presenting information on the web in an official and simple way for foreign persons and refugees who have just arrived in Sweden. Administrative officers and social workers take part in their introduction. In this paper, we describe our investigation of how to present a web design for a "newcomer" to Sweden through a multi-layered design concept, a kind of design that adapts to each person's needs. In this multipurpose concept and portal, it has been important to consider aesthetic values, layout, and graphic design. It has also been important to design a layout that provides different users with different content.
The study presented in this paper highlights an important issue that was subject to discussion and research about a decade ago and has now gained new interest with the current advances of grid computing and desktop grids. New techniques are being invented for how to utilize desktop computers for computational tasks, but no other study, to our knowledge, has explored the availability of the said resources. The general assumption has been that there are resources and that they are available. The study is based on a survey of the availability of resources in an ordinary office environment. The aim of the study was to determine if there are truly usable, under-utilized networked desktop computers available for non-desktop tasks during the off-hours. We found that in more than 96% of the cases the computers in the current investigation were available for the formation of part-time (night and weekend) computer clusters. Finally, we compare the performance of a full-time and a metamorphosic cluster, based on one hypothetical linearly scalable application and a real-world welding simulation.
The disproportion between processor and memory bus capacities has increased constantly during the last decades. With the introduction of multi-core processors, the memory bus capacity is divided between the simultaneously executing processes (cores). The memory bus capacity directly affects the number of applications that can be executed simultaneously at their full potential. Against this backdrop, it becomes important to estimate how the limitation of the memory bus affects application performance. Towards this end, we introduce a method and a tool for experimental estimation of an application's memory requirements as well as the impact that sharing the memory bus has on execution times. The tool enables black-box, approximate profiling of an application's memory bus usage during execution. It executes entirely in user space and does not require access to the application code, only the binary.
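As a rough illustration of the underlying idea (not the paper's actual tool), the sketch below times a black-box binary once on its own and once next to synthetic processes that stream large buffers over the memory bus, then reports the slowdown. The binary path, buffer size, and number of streamers are hypothetical, and a Python buffer copy is only a crude stand-in for a tuned bandwidth stressor.

```python
# Rough black-box estimate of how sensitive a binary is to memory-bus sharing:
# time the target alone, then again while N synthetic "memory streamers" copy
# large buffers in the background, and compare the two runtimes.
# Sketch only: ./target_app, the buffer size, and the streamer count are hypothetical.
import multiprocessing as mp
import subprocess
import time

BUF_BYTES = 256 * 1024 * 1024  # each streamer copies a 256 MB buffer repeatedly

def memory_streamer(stop_event):
    src = bytearray(BUF_BYTES)
    dst = bytearray(BUF_BYTES)
    while not stop_event.is_set():
        dst[:] = src  # large memcpy-style copy, keeps the memory bus busy

def time_target(cmd, n_streamers):
    stop = mp.Event()
    workers = [mp.Process(target=memory_streamer, args=(stop,))
               for _ in range(n_streamers)]
    for w in workers:
        w.start()
    try:
        start = time.perf_counter()
        subprocess.run(cmd, check=True)
        return time.perf_counter() - start
    finally:
        stop.set()
        for w in workers:
            w.join()

if __name__ == "__main__":
    cmd = ["./target_app"]  # hypothetical black-box binary under test
    alone = time_target(cmd, n_streamers=0)
    contended = time_target(cmd, n_streamers=3)
    print(f"runtime alone: {alone:.2f}s, with bus contention: {contended:.2f}s "
          f"(slowdown {contended / alone:.2f}x)")
```

The larger the slowdown under contention, the more memory-bus-bound the application is likely to be; a compute-bound job should show almost none.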
When most commercial clusters had one processor core per node, decreasing the runtime meant executing the application over more nodes, and the associated cost (in $) would scale linearly with the number of nodes. However, with the recent advances of multi-core processors, the execution time can be decreased by utilizing more nodes or by utilizing more cores in the same nodes. In industrial cluster environments, a key question is how to run the applications to minimize the total cost while maximizing the throughput and minimizing the solution times of the individual jobs. The number of cores used and their contribution to the total runtime reduction is especially interesting, since companies often use commercial software that is licensed per year and per process. The annual license cost of a single process is often far greater than that of a complete cluster node, including maintenance and power. In this paper we present a metric for calculating the optimal way to run an application on a cluster consisting of multi-core nodes in order to minimize the cost of executing the said job.
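A back-of-the-envelope sketch of the trade-off described above is given below. The Amdahl-style speedup model, the prices, and the per-process licensing terms are assumptions chosen for illustration; this is not the metric proposed in the paper.

```python
# Cost comparison for running one job with different node/core counts.
# The speedup model, prices, and license terms are illustrative assumptions only.

LICENSE_COST_PER_PROCESS = 20000.0   # hypothetical annual license cost per process ($)
NODE_COST_PER_YEAR = 5000.0          # hypothetical annual cost per node incl. power ($)
CORES_PER_NODE = 8
SERIAL_FRACTION = 0.05               # assumed non-parallel fraction of the job
BASE_RUNTIME_H = 100.0               # single-process runtime in hours

def runtime(processes):
    """Amdahl's-law runtime estimate for a given number of processes."""
    return BASE_RUNTIME_H * (SERIAL_FRACTION + (1 - SERIAL_FRACTION) / processes)

def annual_cost(nodes, cores_used_per_node):
    """Yearly cost: node cost plus one license per running process."""
    processes = nodes * cores_used_per_node
    return nodes * NODE_COST_PER_YEAR + processes * LICENSE_COST_PER_PROCESS

if __name__ == "__main__":
    print(f"{'nodes':>5} {'cores/node':>10} {'runtime (h)':>12} {'cost/yr ($)':>12}")
    for nodes in (1, 2, 4):
        for cores in (1, 4, CORES_PER_NODE):
            p = nodes * cores
            print(f"{nodes:>5} {cores:>10} {runtime(p):>12.1f} "
                  f"{annual_cost(nodes, cores):>12.0f}")
```

Under assumptions like these, adding cores in existing nodes quickly inflates the license bill while the runtime gain flattens out, which is exactly the tension the metric in the paper is meant to resolve.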
The result of this bachelor thesis is a comparison of three different network devices with respect to how many resources are used on them when utilizing SNMPv1 and SNMPv2c polls and traps. The devices tested are an old Cisco router, a modern Juniper gateway, and a Linux server. The experiments conducted indicate that SNMP does not utilize the network devices' resources to a point where it becomes an issue for performance. These tests were done to ensure that SNMP does not use up too many resources on the infrastructure, which would decrease the functionality and performance of the network. This study shows whether or not SNMP monitoring is a problem for the enterprise network.
Context: Successful software development and management depends not only on the technologies, methods and processes employed but also on the judgments and decisions of the humans involved. These, in turn, are affected by the basic views and attitudes of the individual engineers. Objective: The objective of this paper is to establish if these views and attitudes can be linked to the personalities of software engineers. Methods: We summarize the literature on personality and software engineering and then describe an empirical study on 47 professional engineers in ten different Swedish software development companies. The study evaluated the personalities of these engineers via the IPIP 50-item five-factor personality test and prompted them on their attitudes towards and basic views on their professional activities. Results: We present extensive statistical analyses of their responses to show that there are multiple, significant associations between personality factors and software engineering attitudes. The tested individuals are more homogeneous in personality than a larger sample of individuals from the general population. Conclusion: Taken together, the methodology and personality test we propose and the associated statistical analyses can help find and quantify relations between complex factors in software engineering projects in both research and practice.
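To make the kind of analysis concrete, the sketch below correlates five-factor personality scores with a Likert-scale attitude item using Spearman's rank correlation on randomly generated placeholder data. The data, the choice of test, and the variable names are illustrative assumptions, not the study's actual data or analysis pipeline.

```python
# Sketch of the kind of association analysis described in the abstract:
# correlate five-factor personality scores with a Likert-scale attitude item.
# The data are random placeholders, and Spearman correlation is one reasonable
# choice of test, not necessarily the one used in the paper.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_engineers = 47
factors = ["Extraversion", "Agreeableness", "Conscientiousness",
           "Emotional stability", "Openness"]

# Hypothetical IPIP-50 factor scores (10-50) and one attitude item (1-5 Likert).
personality = rng.integers(10, 51, size=(n_engineers, len(factors)))
attitude = rng.integers(1, 6, size=n_engineers)

for i, factor in enumerate(factors):
    rho, p = spearmanr(personality[:, i], attitude)
    print(f"{factor:<20} rho={rho:+.2f}  p={p:.3f}")
```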
The concept of information is often taken more or less for granted in research about information systems. This paper introduces a model that starts with Shannon and Weaver's data transmission model and ends with knowledge transfer between individual persons. The model is in fact an enhanced communication model giving a framework for discussing problems in the communication process. A specific feature of the model is its aim of providing guidelines for designing the communication process. The article ends by identifying a need to develop the model further to also incorporate communication within and between organisations of different kinds.
Systems development seems to be taught in a very traditional way in the Swedish universities. It is supposed that an in-house development starting from scratch will be at hand. This is shown using an investigation of current books in the area and of contemporary educations in systems development. But the needs of the business world are different, and this is shown in an investigation of job advertisements in Sweden. The conclusion is that informatics as a subject is in a deep crisis: we educate students for work that was at hand in the 1980s and not for the 21st century!
A contact service in a municipality is a place where citizens can apply for processing of their claims concerning municipal jurisdiction. Examples could be an application for a place at a pre-school, planning permission, or a change of dustbin. The clerks at the contact centre should be able to provide immediate service in most of the matters. This requires the work process for each matter to be known. Before the start of a contact service this knowledge existed in the administration responsible for the actual claim. In many cases it was tacit and not described. This paper discusses the problem of making this knowledge explicit and described so that it can be used at the contact service. Issues concerning work organisation, personnel, and job satisfaction are recognised, but not in focus. Instead our focus lies on the work content, the processing of the claims, which the clerks are dealing with. It is a qualitative study, based upon three existing contact services and one that is in the design phase. We start with a brief discussion of different types of knowledge, related to classical epistemologies within the organisation area (Nonaka & Takeuchi, Brown & Duguid, Cook & Brown, Polanyi, etc.). Based upon empirical material from the cases we identify some typical knowledge categories. It might be general knowledge about rules, procedures and such things; it might be experience-based knowledge from previous claims, typical claims, and work praxis developed over time. It might also be knowledge about the specific citizen and about the specific application. But there can also be totally new categories. Two categories we are fairly sure to identify are matter-oriented knowledge, concerning the actual matter, and procedural knowledge, concerning the processing of the matter-oriented knowledge. In our previous research about workflow, four levels have been identified, and we suspect the same basic reasoning might apply here.
When the Internet made its breakthrough in the middle of the 1990s, a revolution occurred comparable to the industrial revolution. Suddenly the cost of information transport was reduced to almost zero, and genuinely new opportunities arose. Work that can be performed by unskilled workers is outsourced, and the focus is on the business process. This requires a genuinely new way of doing business; we see a need for trust, loyalty, and sharing of values. Education of users at the workplace will be a major concern, and a common language and a mutual, deep understanding of the concepts and social contexts used is a prerequisite. A 3D apple model for context is described. For defining the social context, a user-centred approach must be used. We need genuinely new informatics paradigms adapted to the network economy. This requires a massive re-education of all workers, both white and blue collar. To sum it all up: reliable and sustainable production, availability of reliable information, trust, and flexibility are the means for us to survive in this new economy.
Establishing contact centres in municipalities is a contemporary issue. Many municipalities have started establishing contact centres as their face towards the environment. A problem often neglected is the integration of the contact centre with the other administrations of the municipality. Our focus lies on designing common processes in the municipality to ensure an information flow between the contact centre and the administrations. Derived from the empirical data, the authors present a model whose purpose is to establish processes that might reduce conflicts between the contact centre and the administrations. A focus is on the use of naming and tagging of matters. The model is based upon the ontologies of the clerks working with a matter in a workflow. One characteristic of this model is that the ontologies are developed in close cooperation with the clerks.
The standard of reporting in diagnostic studies has generally been low. Fortunately, this issue has begun to be addressed in recent years through the discussion of important methodological issues in educational series, textbooks, and checklists. Double-blind, placebo-controlled, oral food challenges (DBPCFC) are considered to be the gold standard for diagnosis of food allergy. However, there is no consensus regarding how to interpret the outcome and how to define positive and negative provocations in DBPCFC. Furthermore, since most theories on the diagnosis of food allergy rely on the assumption that the DBPCFC has a high accuracy, this accuracy must be formally statistically evaluated. In this review, we discuss essential methodological issues for diagnostic accuracy studies in general and for oral food challenges in particular and discuss the importance of methodological issues as a guide for forthcoming studies of diagnostic procedures.