Jean (Jeannie) H. French
Jean H. French, Ph.D. is an Associate Professor in the Department of Computing Sciences at Coastal Carolina University (CCU). CCU is a public, liberal arts institution minutes from Myrtle Beach, SC. CCU offers undergraduate, graduate, and Ph.D. programs to nearly 10,000 students. The Department of Computing Sciences offers Bachelor of Science degrees in Computer Science, Information Systems, and Information Technology.
Dr. French's main areas of teaching are in Web programming, multimedia, and data sciences. Classroom projects often involve real-world problems for students to solve for the university or industry. Dr. French is highly involved in academic assessment as chair of departmental assessment and ABET accreditation, and also as chair of assessment for all university academic programs. Dr. French also serves as the vice chair of Faculty Senate.
When not teaching and assessing, she is busy raising her two little girls (4 and 6) with her husband. Though she enjoys living in Myrtle Beach, she is one of six children from Boston, Mass. and enjoys heading home to visit family.
BEGIN STAGE 1 - PART A: Due August 6th ******************************************************************************
Stage 1 - Part A - Intro IRC Activity
Part 1: How do people interact? The interaction was polite, informal conversation with questions, answers, comments, "attaboy" confirmations, and some emoticons.
What is the pattern of communication? Is it linear or branched? Formal or informal? One-to-many, one-to-one or a mix? The pattern of communication is informal and branched, with mostly one-to-many exchanges. For example, informal speech was mentioned in the previous question. The conversation was branched because in the middle of a related thread of comments, there were side comments like "Sorry, Firefox crashed" by Heidi. The conversation was mostly one-to-many because one person might ask a question, and several others chimed in with suggestions.
Are there any terms that seem to have special meaning? The terms info, action, and topic all seem to have special meanings; these terms were pulled out and handled specially in the final PDF meeting notes.
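For illustration, meetbot-style commands are typed at the start of a message with a # prefix. The lines below are a sketch of the syntax only, not lines taken from this particular log:
#topic Release planning
#info The beta builds are ready for testing.
#action Heidi will look into the status of releases.
The meetbot collects lines tagged this way into the corresponding sections of the generated meeting minutes.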
What advantages might IRC have over other real-time communication methods (like Google Chat or Facebook Messenger)? Are there potential disadvantages? While I used the Firefox browser plug-in, it was not necessary; there were a number of ways to connect to the conversation. The disadvantage is that IRC is text-only. Aside from the text itself, there was no way to convey information such as images, video, or sound.
Can you make any other observations? I noticed that when users wanted to get the attention of a particular user, they would type that user's name.
Bonus question: Why didn't Heidi and Darci's actions get picked up by the meetbot? I really don't see this. There were three #action notes in the conversation by users darci and heidi, and all three were included in the meeting summary PDF:
18:12:58 <darci> #action amber will try graphviz (indicated in the summary at 1.c)
18:51:28 <heidi> #action Heidi look into the status of releases in the next week. (indicated in the summary at 1.h)
19:06:06 <darci> #action Darci will find John some easy Python projects to work on. (indicated in the summary at 2.a)
For Part 2: N/A
For Part 3: I connected to the Mousetrap IRC. There were no active conversations when I was logged in, but I reviewed a number of the logs. There do not appear to be many users in any of the logs, and the most active user is stoney. During one meeting, users kevin-brown and heidie had little to report and offered little feedback to user stoney. User stoney had over 20 entries (from 17:06:33 to 17:13:28) with no response from either of the other users, and user stoney pointed out the "silence." In one of the logs, there appeared to be a meeting with only stoney, and the log seemed to be used not for conversation but as a record of individual activity. Meetings seem to be ended with just an end-meeting command (presumably #endmeeting) with no confirmation that the meeting is about to end. It appears that if several minutes go by without activity, the meeting is simply ended without comments about the next meeting. I suspect the users must be communicating in other ways as well.
Part 4 (optional): N/A
Stage 1 - Part A - Project Anatomy Activity
SUGAR LABS PROJECT
Summary of Contact Pages: None of the teams had contacts cross-listed (each team had unique members). All three teams used an IRC channel. In addition, the Documentation Team and the Activity Team each had a mailing list. Both the Development and Documentation teams are currently without coordinators.
Activity Team: The activity team is responsible for keeping track of all of the activities available. This includes finding and working with developers and other teams. Rather than allowing random, uncoordinated activities from different people, they encourage developers while providing resources and oversight for the organized development of Sugar.
Development Team: The development team is responsible for actually building the Sugar environment. This includes fixing problems as well as adding new features to Sugar.
Documentation Team: The Documentation Team is responsible for creating and updating the user manuals. They need to have different versions of the manuals in different languages. For example, they are working on a Spanish version for the Sugar FLOSS manual.
Tracker: There are three categories of tickets listed: defect, enhancement, and task. The "task" category is only listed once, and a majority of the tickets are defects. Ticket information includes what appears to be a unique number, a summary, and the status of the ticket (new, assigned, reopened, or accepted). The ticket also includes an owner username, the type (as previously described), a priority, and whether the ticket is tied to a particular milestone (if not, it is left as unspecified).
Repository: It is self-hosted. Viewing the webpage source code revealed URLs such as git://git.sugarlabs.org/sugar-base/mainline.git. When I simplified the URL and went to git.sugarlabs.org, it showed that they are using Gitorious, which states "Gitorious is a great way of collaborating on distributed open source projects." At http://www.gitorious.com/, you can see that there are managed-server and local-install options. Since the source code shows git.sugarlabs.org, it looks like they opted to install it on their own servers under the sugarlabs.org domain.
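For illustration, a URL like the one above is presumably what a contributor would hand to git to obtain the code; the exact command is my assumption, but it would look something like:
git clone git://git.sugarlabs.org/sugar-base/mainline.git
This creates a local copy of the mainline repository to browse and build from.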
Release Cycle: The roadmap is updated at the beginning of each release cycle. The release team manages the release cycle, which in turn drives the updating of the Development Team's roadmap.
THE SAHANA EDEN PROJECT
Summary of groups: When comparing the teams, it wasn't directly clear who keeps the various development pieces organized. There is a Coding Tasks list that is open for individuals to pick and choose what they want to work on. The information is not as consistent on this website. For example, the Designers page makes no mention of the communication methods used, whereas the Developers page had IRC and mailing list information readily available.
Tracker: The tracker page first shows categories and then you need to click on the categories to find the list of individual tickets. For example, there was a category titled "My Tickets" which would (I assume) list only the tickets related to me (had I had any tickets in the system).
Active Tickets: The Active Tickets page shows the list of active tickets by a unique number, along with a summary. In addition, there is the component related to the ticket (Web, CSS, GIS, etc.), a version (most are categorized as trunk tickets), the priority, the owner, the status, and the date the ticket was created. Some rows of tickets are highlighted in blue.
Repository: It is local. The installation directions specifically state navigating to localhost after installation.
Release Cycle: There appear to be three named projects with release numbers. Two don't have dates, and one says it is four years late. There is a tracking graphic that shows what percentage of the release is complete (it also links the associated tickets for further detail on the percentage complete). The page also shows a list of the different modules/activities included in each release.
BEGIN STAGE 1 - PART B: Due August 14th *******************************************************************************
Stage 1 - Part B: FOSS Field Trip
PART 1: SourceForge
Topic: Video Editing
Categories: 2079
Languages: C++(79); Java(342); C(239); C#(202); Python(99); JavaScript(90); PHP(87); Visual Basic .NET(69); Delphi/Kylix(57); Assembly(29); Perl(25); Unix Shell(23); Lazarus(22); Matlab(20); Visual Basic(20)
Top 4 Programming Languages: C++, Java, C, C#
Meanings of Statuses: According to Koper-Capodistria (2013), the SourceForge statuses indicate levels of progress of software development using a numbering system from 1 to 7. In order of progress, the statuses include planning, alpha, beta, and stable. The statuses are upgraded with "improvements in completeness, consistency, testability, usability, and reliability." Inactive projects are those not included in the other status designations, as there is either no code available or a lack of developers to work on the code. Koper-Capodistria further states that SourceForge leaves it up to developers to determine whether a project is pre-Alpha vs. Alpha or Production/Stable vs. Mature, as there are "only vague definitions" of the categories. Reference: Koper-Capodistria, S. (2013). Open source software: Quality verification. Proceedings of the 9th IFIP WG 2.13 International Conference (OSS 2013). Available from books.google.com.
Comparison of Categories: I compared the "Planning" and "Mature" statuses. It was surprising that a number of projects in the planning stage had not been updated in years. There was one project that was registered in 2006 but was last updated in 2012. I don't quite know why it was abandoned yet never moved to Inactive; this makes me wonder what really qualifies as inactive. For the ones I checked in the planning stage, there were no downloads in the "This Week" downloads section, though some did have a "browse code" option. The mature projects I checked had more recent updates (within recent weeks, even). They had useful user reviews (up to 5 stars) and a download option, and their descriptions were also more developed.
Projects Most Used: First, I'll note that even though I entered "video editing" in the search, the results were not limited to projects that actually edit video; some were video players, others sound editors. I sorted by "most popular" to determine which were the most used. Some of these products had tens of thousands of downloads in a week (one had over 100,000 downloads).
Project in My Category: I chose the "DVDStyler" project in the video editing category. The link is http://sourceforge.net/projects/dvdstyler/
What does it do? DVDStyler is a DVD authoring program that "makes possible for video enthusiasts to create professional-looking DVDs." It includes DVD menus, templates, format-selection options, and basic editing features.
Programming language? C++
Who is likely to use it? Users who want to create DVDs. The description says this, and so do the user reviews.
Most recent change? The most recent change was 6 days ago, so the developers are active. More specifically, the developers closed a batch of tickets on August 5th for older problems that had already been fixed, so the change dates don't fully reflect when the work was done. The wiki doesn't have much information. There is a link to the project's external site, but my security software flags it as malicious. There are also reviews saying the software itself has malware, so I'm not going to dig any deeper. If I were looking for this type of software, I wouldn't download it.
Committers? I cannot find anyone other than user ntalex (Alex Theuring) in any of the documentation.
Would I use it? As mentioned before, this software looks dangerous, so I wouldn't use it. If I can't even safely visit the website to read more, I'm definitely going to avoid it. I'm a regular user of SourceForge and I've never come across software that was so sketchy; I was surprised. One reviewer stated "@sourceforge - Really? You're going to allow this?" and I share that surprise and frustration.
Part 2 - OpenHub
Programming Language: Java
Lines of Code: 1,886,909
USERS
Locations: Indianapolis, IN, USA (Ben); Cape Town, South Africa (Simon Kelly); Ho Chi Minh City, Vietnam (Kim Anh Vo)
Number of languages: 15
Language with second-highest lines of code: JavaScript
Language with highest comment ratio: Java
CONTRIBUTORS
What is the average number of contributors in the last 12 months?
No July 2015 numbers, so used July 2014 through June 2015
July 18
August 10
September 9
October 16
November 12
December 16
January 17
February 12
March 16
April 18
May 11
June 7
Total: 162 contributors over 12 months, for an average of approximately 13.5 contributors per month.
How long have the top three contributors been involved in the project? The top two contributors (dkayiwa and raff) have been with the project for over four years. The third top (wyclif) contributor has been with the project for almost two years.
Compute the 12-month average of commits.
No July 2015 numbers, so used July 2014 through June 2015
July 143
August 43
September 30
October 85
November 84
December 57
January 92
February 79
March 92
April 92
May 82
June 63
Total: 942 commits over 12 months, for an average of approximately 78.5 (roughly 79) commits per month.
Comparison to MouseTrap:
Just a quick comparison: this project uses only 5 languages versus the 15 used for OpenMRS. It has 250,455 lines of code versus 1,886,909, and only 31 commits in the last 12 months. This project had a peak in commits in 2014 but very little activity since (single digits).
Stage 1 - Part B: Project Evaluation Activity
Walk through of an evaluation of the OpenMRS project
MISSION CRITICAL CRITERIA - VIABILITY
Size/Scale/Complexity - Though the video suggests that approximately 6 active contributors is a good number and that many more would make a project too complex to learn, I would give OpenMRS the top score of 3 in this category even though it has approximately twice the suggested number of active contributors. The reasons are that the developer documentation states there are "over 50 active projects." Since there is not one massive program, there should be some flexibility in finding a fit with so many projects available. In addition, the developer documentation says that "when looking at the development of code for the OpenMRS core software, you'll find lots of collaboration." With good collaboration, there will be a way to further understand the project and its needs and to get feedback. Finally, OpenMRS supports the use of mentors, which is a wonderful way to help guide new learners.
Activity - According to the video, a reasonable number of commits is 10-30 per month. OpenMRS has over 70 per month. Because this is quite large, I would suggest a score of 3; it is well over the reasonable number, demonstrating a very active project. I'll note that commits have dropped significantly since 2012 and there has been a slight drop in developers. I suspect that during higher periods of activity the projects were numerous and actively being developed, but fewer resources are now needed for the more mature aspects of the project.
Community - Over the last year, the software has been downloaded between 2,700 and 3,500 times per month, which looks stable. When viewing the discussion boards, there are some concerns. While some posts are getting views, the ratio of views to responses is quite low: one question had 480 views and only 9 replies, and another question is already 6 days old (a long time for a developer to be stuck) with 43 views and no responses. The IRC logs show times when people log in and cannot find anyone to respond, but in general there is daily activity. Some conversations showed confusion; someone directed a question at a specific user, and that user had to ask "what was my last response?", so I think IRC is not the best way to communicate here. Based on the download history, discussion activity, and IRC activity, I would score this category as a 2.
MISSION CRITICAL CRITERIA - APPROACHABILITY
The documentation for getting started was organized and clear. There was specific mention of various technologies available to get started and links for resources. I would categorize this as ideal. (Score of 3)
MISSION CRITICAL CRITERIA - SUITABILITY
Artifacts - Bug fixes: There are two lists of introductory issues; one has 13 items and the other has 39. Not all are bugs - some are new ideas and others are additions to the project. The process for how to "create an issue" is well-documented, but when it comes to responses and feedback for bugs, too much time goes by; in one post, over 12 days passed before there was a response. I score it as insufficient. (Score of 1)
Contributor Support - As previously mentioned, I don't think the support in the IRC and -especially- the discussion boards is timely enough. I think, regardless of all of the documentation provided, this area is far too lacking for a high score. I score it as insufficient. (Score of 1).
Because one of the mission critical sections was scored lower than a 2, no secondary criteria were evaluated.
Stage 1 - Part B: Blogging Activity
https://drjeanfrench.wordpress.com/
Stage 1 - Part B: FOSS in Courses Planning
Possible Activities: My initial thought was that we would not be able to do this easily, but after reading the material, I see ways to contribute beyond code. This is very encouraging. We would need to start off small. These are some introductory activities I think we can accomplish:
Glossary of technical terms
Graphics
Documentation
Testing
Interview a FOSS user and find out why they use FOSS, benefits/drawbacks, etc.
Shadow a FOSS contributor
Activities for Class
Documentation - I think helping to create user documentation would be a great way for students to contribute and really learn about the project. It would result in them truly getting to know how the project works.
Graphics - Working on graphics would be a great project for my Multimedia class. This could even be done in conjunction with the documentation work; highly visual documentation (with graphics) could be created.
In your reading, did you find existing materials? If so, describe how you would modify them to fit your class. I went to http://teachingopensource.org/index.php/RIT/The_Course and it mentioned documentation in Week 7, but when I scrolled down, the link was broken. I tried http://foss2serve.org/index.php/Learning_Activities, but the section on documentation wasn't really about user documentation as I was envisioning it. It did have "document code with meaningful comments," which might be a great post-introductory activity for my class. The page http://zenit.senecac.on.ca/wiki/index.php/Main_Page was too chaotic; when I searched it for "documentation" there were no results, which I thought was odd. If their material is too cumbersome to navigate, it will not be good for students either. Moved on...
If you did not find existing materials, summarize the activity in a sentence or two. I didn't, but we can probably create or improve documentation by using the software in a comprehensive manner and producing organized, visually appealing documentation with graphic examples.
BEGIN STAGE 1 - PART C: Due September 9th *******************************************************************************
Stage 1 - Part C: Bug Tracker Activity
Define what each of the column names below indicates. Include the range of possible values for 2-7 below. Feel free to explore beyond the page to find more information. ID - Unique identifier for each ticket
Sev - Severity: This indicates how severe the problem is - from blocker ("application unusable") to trivial ("minor cosmetic issue"). You can also use this field to indicate whether a bug is an enhancement request. Options are blocker, critical, major, normal, minor, trivial, and enhancement.
Pri - Priority: The bug assignee uses this field to prioritize his or her bugs. It's a good idea not to change this on other people's bugs. The range of options is unknown to me; I didn't see a list of options for this, perhaps because I don't have any actual bugs to prioritize.
OS - Platform and OS: These indicate the computing environment where the bug was found. The list is long (just shy of 20 options) including AIX, IRIX, Linux, FreeBSD, Solaris, Mac OS, and Windows. It pre-selected Windows for my bug.
Product - Product and Component: Bugs are divided up by Product and Component, with a Product having one or more Components in it. For example, bugzilla.mozilla.org's "Bugzilla" Product is composed of several Components:
Administration: Administration of a Bugzilla installation.
Bugzilla-General: Anything that doesn't fit in the other components, or spans multiple components.
Creating/Changing Bugs: Creating, changing, and viewing bugs.
Documentation: The Bugzilla documentation, including The Bugzilla Guide.
Email: Anything to do with email sent by Bugzilla.
Installation: The installation process of Bugzilla.
Query/Buglist: Anything to do with searching for bugs and viewing the buglists.
Reporting/Charting: Getting reports from Bugzilla.
User Accounts: Anything about managing a user account from the user's perspective. Saved queries, creating accounts, changing passwords, logging in, etc.
User Interface: General issues having to do with the user interface cosmetics (not functionality) including cosmetic issues, HTML templates, etc.
There are far too many Products to list here; a list of them can be found at https://bugzilla.gnome.org/enter_bug.cgi?classification=Core (for the Core classification only).
Status - Status and Resolution: These define exactly what state the bug is in - from not even being confirmed as a bug, through to being fixed and the fix confirmed by Quality Assurance. The different possible values for Status and Resolution on your installation should be documented in the context-sensitive help for those items. When you create a bug, the only status option is NEW. Others seen in the tracker are ASSI, REOP, NEED, and UNCO. From looking at the life cycle of a bug at https://www.bugzilla.org/docs/2.16/html/how.html, UNCO is probably Unconfirmed, ASSI is Assigned, REOP is Reopened, and NEED most likely means more information is needed from the reporter (NEEDINFO).
Resolution - See Status, but specific to a resolved bug. Options are fixed, duplicate, wontfix, worksforme, invalid, remind, and later. https://www.bugzilla.org/docs/2.16/html/how.html
Summary - A one-sentence summary of the problem. It is required when opening a new bug.
THE ABOVE DEFINITIONS WERE COPIED FROM 5.3 Anatomy of a Bug, as cited below, except where otherwise stated or where I have added to them.
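As a side note on the status values above: the life cycle diagram at the bugzilla.org link shows roughly UNCONFIRMED → NEW → ASSIGNED → RESOLVED → VERIFIED → CLOSED, with REOPENED feeding a bug back into the cycle when a fix doesn't hold. This is my simplified reading of the diagram, so treat it as approximate.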
Describe how you discovered the definitions and how you found the information above (hint: the advanced search shows the options, or the Reports link has a link). I found the https://www.bugzilla.org/docs/4.4/en/html/bug_page.html page, which is 5.3 Anatomy of a Bug. To find the ranges asked for, I went ahead and started a new bug to see what the form fields were. Note: I only selected a core issue, so other bugs will probably have some different options (like Product), though some options (such as Severity) will probably be the same.
Identify the order in which the bugs are initially displayed. The bugs are ordered by Status, but not in alphabetical order; I assume there is some number associated with each status that is used for ordering.
What is the meaning of the shading of some bug reports? I don't know. It was mentioned in the IRC meeting, but it would be cheating to put that answer here, so I'll leave it blank.
What is the meaning of the colors used when describing a bug (red, gray, black)? According to the style sheet, bug rows are styled according to severity:
.bz_critical { color: red; } - a critical bug is shown with red text.
.bz_enhancement { color: #666; background-color: white; } - an enhancement is shown with grey text on a white background.
See https://bugzilla.gnome.org/skins/standard/buglist.css?1438076713
Select a bug that you think that you might be able to fix and look at it more closely (click on the bug number).
BUG 736815
Identify when the bug was submitted. 2014-09-17 16:04 UTC
Identify if there has been recent discussion about the bug? There are no comments after the original explanation.
Is the bug current? Though dated 2014, the status is still new.
Is the bug assigned? To whom? Yes. The bug is assigned to the GNOME Web maintainers.
Describe what you would need to do to fix the bug. It is basic HTML/CSS coding. The bug reporter is describing extra white space in the navigation. It is not uncommon to use a list to display navigation links; the fix is to style the list items to display 'inline' instead of one item per line. (The reporter confirms this is the issue.) The maintainers just need to adjust the code so that it fixes the problem.
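As a minimal sketch of the kind of CSS fix involved (the selector here is hypothetical, not taken from the site's actual stylesheet):
/* Hypothetical rule: lay the navigation list items out on one line */
/* instead of stacking them, removing the extra white space */
nav ul li { display: inline; }
A real patch would target the site's actual class names and might also need to adjust the list's margins and padding.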
Repeat the previous step with a different kind of bug.
BUG 81041
Identify when the bug was submitted. 2002-05-07 13:52 UTC
Identify if there has been recent discussion about the bug? There is no recent discussion, but this bug was discussed on and off for four years.
Is the bug current? The bug has been reopened.
Is the bug assigned? To whom? Yes. The bug is assigned to Panel Maintainers.
Describe what you would need to do to fix the bug. I have no idea. It is talking about the view of panels and mentions Sawfish and Metacity. I would guess it involves a hide/show option in HTML/CSS, because they mention DOM methods.
Summary of bug activity for the last week. How many bug reports were opened in the last week? How many were closed? 321 opened; 238 closed.
What was the general trend last week? Were more bugs opened than closed or vice versa? More were opened.
Who were the top three bug closers? Why is this important to know? (1) Matthias Clasen, 32; (2) Michael Natterer, 12; (3) Bastien Nocera, 12. They are the ones who are active and most successful at closing bugs.
Who were the top three bug reporters? Are these the same as the top three bug closers? What is the overlap in these two lists? There was a top four because the last two were tied: (1) Bastien Nocera, 9; (2) Christian Stadelmann, 7; (3) m.rick.mac@gmail.com, 6; (4) Jo, 6.
The lists are not identical, but Bastien Nocera appears in both, so there is some overlap between the top bug closers and the top bug reporters.
Who are the top three contributors of patches? (1) Sebastian Dröge (slomo), 12; (2) Debarshi Ray, 10; (3) Lionel Landwerlin, 9.
Who are the top three reviewers of patches? (1) Bastien Nocera, 24; (2) Sebastian Dröge (slomo), 22; (3) Matthias Clasen, 14.
What is the overlap between these lists and the bug closers and bug reporters? Matthias and Bastien are also in the top three bug closers, and Bastien is also in the top three bug reporters. What is the overlap between patch contributors and patch reviewers? Sebastian is in both lists.
Generate Graphical Reports
What class were the majority of the bugs for braille? Normal.
What other reports can you generate? Many, depending on what you are looking for, but one that seemed important was charting the same components by status. I kept all of the same settings except changing the vertical axis to status, and found that, for braille, there were 25 new reports, 3 assigned, and 1 needinfo.
Stage 1 - Part C: Source Code Management/Control Activity
Done.
Stage 1 - Part C: FOSS in Courses Planning 2
Recalling your list of activities/topics from the "FOSS in Courses Planning 1" activity, identify the ways that these FOSS activities/topics can be structured. The only way I see this working well is by including it as a project - and maybe not as part of the regular curriculum I am teaching now, but as an extra project for student research activities.
List the revised activities on your wiki page.
TOPIC 1 - Documentation
Identify some possible learning outcomes that should be fulfilled with the activities/task. (1) Compare the documentation against the software's actual functionality; (2) Write clear and precise instructions; (3) Create images that enhance the written instructions
Describe any pre-requisite knowledge needed to complete the activity. This does not need to be a complete list. Use of screenshot and photo-editing software. Use of word processing software.
Estimate the time required for instructor prep, for student completion, and elapsed calendar time. Are you going to have to synchronize your activity with the community, or can the activity/topic be covered independent of the HFOSS community schedule? The instructor prep time is minimal, but the instructor needs to be able to check and be involved throughout. The activity can be done without synchronization. In addition, small parts can be completed and made available instead of having to wait for an entire document.
Think about possible input required from the HFOSS community. How much input is required and what kind? I think communication with the project is important, and since there are holes in the documentation area, input on what is needed would help. Whether or not users end up using the documentation is fine either way.
If the result of the activity is contributed back to the HFOSS project, describe the contribution and its usefulness. User documentation has been a recurring issue throughout the POSSE 2015-09 process; some of the projects we reviewed showed that documentation is an area that is lacking. Long-time users do not need documentation as much as new users, but new users will find it most valuable and helpful.
Describe the assessment/grading approach - What will the basis for grading be? Will this be a team activity or individual? Is there a role for the HFOSS community in helping assess student work? For instance, must the work be committed or otherwise accepted by the community? The grading is not complex: someone not working on the project should be able to easily follow the instructions in the documentation created. Usability, clarity, and organization can be graded using user feedback. Some sections can be individual, but it is important for students to come together as a group to compare and check for consistency; they should also review each other's work to help ensure quality, and a template should be followed. The community should accept the work, though some users might skip using the documentation; it should be especially useful to new users. It would be a bonus if active members of the community would review the work and provide suggestions.
List any questions or concerns that you have about the activity/task. The documentation could be time-consuming and tedious. It must be broken up to give a sense of accomplishment, as well as to make sure students do not leave work unfinished if they leave the project, graduate, etc. Small units of work should address these concerns.
List any stumbling blocks or barriers to carrying out the activity/task. I need to see more detail about what should be covered and how it should be implemented.
TOPIC 2 - Website Modifications
Identify some possible learning outcomes that should be fulfilled with the activities/task.
(1) Write clean, well-organized code; (2) Write clear and precise instructions; (3) Create images that enhance the written instructions; (4) Create a clickable navigation that is user-friendly
Describe any pre-requisite knowledge needed to complete the activity. This does not need to be a complete list. Web programming skills. Image editing skills.
Estimate the time required for instructor prep, for student completion, and elapsed calendar time. Are you going to have to synchronize your activity with the community, or can the activity/topic be covered independent of the HFOSS community schedule? The prep time and some of the activities would be similar to Topic 1, but this requires approval from the community and access to the site that needs editing. A complete test site is needed for development and testing prior to implementing the finished product. Also, the HFOSS project's schedule might change with new updates, etc., and these items would need to be coordinated as they happen.
Think about possible input required from the HFOSS community. How much input is required and what kind? This cannot be done without the community's approval, and they would need to be informed at each step for review.
If the result of the activity is contributed back to the HFOSS project, describe the contribution and its usefulness. The website for this HFOSS project is horrible. There are broken images everywhere. There is no doubt a working website would be useful to every user.
Describe the assessment/grading approach - What will the basis for grading be? Will this be a team activity or individual? Is there a role for the HFOSS community in helping assess student work? For instance, must the work be committed or otherwise accepted by the community? Grading would be similar to any web project. Testing will be essential and the community needs to be involved in the review and final replacement of current pages.
List any questions or concerns that you have about the activity/task. I need to work out how to actually get this done: who to contact, what the needs are, and how to prioritize what is possible.
List any stumbling blocks or barriers to carrying out the activity/task. Nothing more than mentioned in the last question. I would just need students who are ready to go with development.