Thursday, July 22, 2010

Call for coherent, systematic field summaries

The overview

As a Ph.D. student, figuring out what has already been done by past researchers is by far the hardest problem I have come across. The current expectation is that every student will browse dozens of conferences and journals going back decades. At the end of this long and tedious process, there are several common pitfalls. First, if even a single relevant paper was missed, months of work can turn out to be wasted because they were spent developing an idea that already existed. Second, even when every relevant paper is located, students almost always digest each paper individually and rarely draw the links between papers. These links are critically important, as described below.

The details

I recently attended the International Computer Vision Summer School (2010). There was one session in which students were asked to read 3 papers and trace the ideas in these papers back as far as possible in the literature in the form of a tree. After reviewing the submissions, the organizer was quick to point out that there was very little overlap in the trees submitted for these 3 papers. He went on to explain how the 3 papers had essentially presented exactly the same concept, just from slightly different angles and using different terminology and notation. Thus, the trees for the 3 papers should in fact have been identical! This was an excellent concrete example of the problem - it takes an "already expert" to extract these deep and extremely important connections from a literature review. Leaving it up to every new student is not only a fruitless effort (as they will not get the correct information out anyway), it is also an enormous duplication of effort! There should be a system in place where efforts are pooled to do this completely and correctly a single time.


Previous attempts


"Survey papers" are occasionally written, and these come very close to a good solution. However, they suffer from a lack of viewpoints as well as a lack of frequency. These two issues are directly addressed in the following section.


The proposal


There are two phases to my proposal for action. The first is the "catch up" phase, followed by the "maintenance" phase.


"Phase 1: Catch Up"


The "catch up" phase is the longest and most difficult, but also the most helpful.
I propose that the general population of a field nominate and elect a committee of experts who are the most qualified, accomplished, and knowledgeable people in the given field. These people would be charged with two tasks. The first is splitting the field into an appropriate number of subfields. I can only speak of my field of Computer Vision. One could break this field down into "Structure from Motion", "Object Recognition", "SLAM", etc. There should be 3-5 experts in each of these sub-fields on the committee. Each sub-committee is then charged with producing a survey paper of the work in its area, starting as far back as possible and going to the present year. This will certainly be a large document with many, many references; however, it is important not to get lost in the task of listing references. Drawing the connections between papers and following the evolution of each idea is the central point of this whole project. The payment for this exercise is an overwhelming sense of advancing the state of the art of scientific research procedures, as well as a resume line item which indicates that you are a recognized expert.


"Phase 2: Maintenance"


This is the easy phase! This process must be performed yearly (or at some other regular interval). Again, a committee must be selected. However, all that must be done is a short review of what has happened in each sub-field in the last year. References should NOT, for the most part, be to papers more than a year old. This keeps the reviews linear and sequential, making them extremely easy to follow.




Potential problems


After initial conversations with some field experts, it is apparent that a project like this could raise political issues. You may get people complaining "Why is my paper not included in the survey!?". You may also expose parallels that the original authors did not realize, making them feel "foolish". It is my opinion that the progress of the field and the rapid integration of young researchers are much more important than protecting individuals from this type of silly whining.

Potential benefits

If students could read a couple of these documents and be fully caught up on the state of the art of their sub-field, many new doors would be opened. First, people would not be so restricted to a single sub-field. It would be possible to keep current in multiple sub-fields simply by reading through these documents when they are released each year. I have seen many cases where a solution from an outside field was adapted to a problem in the field with amazing results. Second, students could move forward confident that their work is actually on a track the field is interested in. They could also be certain that their work has not been previously attempted. The time savings, multiplied across the number of students, are incredible. By applying the correct resources (the experts) in the correct places (a directed effort on these systematic summaries), a much more efficient community can certainly be achieved.

Conference Summary Committees

At every professional conference, hundreds of papers are presented. This can quickly become quite overwhelming. For people in attendance, the game plan seems to be to scan the list of titles in the conference schedule to see which posters and talks seem most interesting and/or most applicable to the individual's research objectives. To be sure, a major goal of conference-going is to network and make new contacts. However, there should be another major goal, one which is often talked about but mostly overlooked: keeping current with ideas and discoveries in fields related to, but not exactly in, your research area. This is nearly impossible to do by simply walking around and looking at posters.

Enter the solution. At each conference, a panel should either be appointed or elected. This panel should consist of leading experts in many or most of the sub-fields represented at the conference. These experts should meet at the end of the conference to decide what its serious contributions were. It is no big secret that the majority of papers submitted to a conference are incremental improvements on existing methods with mildly better results, yet it is quite tough to pick out the "serious" papers without a solid background in the sub-field they came from. Therefore, it should be up to this proposed panel to construct a short document (5 pages or fewer) summarizing the contributions of the conference. This would allow conference attendees to receive the "take home messages" at the end of the conference, and also give people who were unable to attend the big picture of what they missed. Handing a colleague a DVD with 400 abstracts and papers and saying "here are the proceedings" is almost certain to provoke the same exercise of scanning titles and reading only the papers relevant to his current research. If, instead, one could hand a colleague a 5-page document and say "this is what happened at the conference", the entire field would stay much more informed and up to date.

Students are not being prepared for industry

I can only speak about my field (computer vision and image processing), but I imagine the situation is similar across the board. What we learn in college are "the fundamentals" - the theoretical (often too theoretical) ideas of many topics. We are seldom asked to implement these ideas in software. When we are, it is done with absolutely no consideration of the process - that is, you can use whichever language you want, whichever method you like for revision control (including none!), work by yourself or in a group of whatever size you choose, and the list goes on. The only thing that matters is the result. In an industrial setting, exactly the opposite is true. Working on a team of programmers is critical. You must understand how to share responsibilities, ideas, and code. These are the most important skills for success in any real setting, and they are rarely exercised - and definitely not taught - in college.

After a recent interaction with the hiring manager for the GE Global Research Center, I have learned that they actually plan for at least an entire year of negative productivity from new students. That is, new hires are an investment: they hire a new student with the intention of training them for at least a year before they start adding value to the company. It seems to me that this transition should be much, much smoother. It should (clearly?!) be part of the responsibility of post-secondary institutions to prepare students for their next role as an employee.

Tuesday, June 1, 2010

Stop letting students play online during class!


Walk into almost any college classroom and you will almost certainly see at least a handful of students staring at a laptop screen, clearly not paying attention to the instructor. In this discussion, let's ignore the underlying reasons the students are not paying attention (most of the time it is the lecturer's fault for not engaging the students properly). No matter whose fault it is, fooling around on the internet during class is rude, unacceptable, and should (and can!) be stopped!

I am not suggesting laptops be banned from classrooms. In fact, they can be extremely valuable tools for taking notes, communication, demonstration, and much more. However, it is rather obvious when a student is doing something other than class-related work. It is very simple - with a quick glance around the room during a pause in speech, the students whose eyes shift to the lecturer to see what the pause is for are paying attention; the students whose eyes remain glued to the screen are not! An even more effective check is walking a lap around the room (though this may not be feasible in a large lecture hall). It pains me to see faculty blindly continue a lecture when many students are clearly off playing in “internet land”. Use your authority and tell them to put away the laptops!

I am actually NOT an advocate of required class attendance, but if a student does come to class, it is extremely rude not to pay attention and, even worse, to become a distraction to other students who did come with the intent of paying attention.

As Monica Bulger mentions at http://monicabulger.com/2010/04/banning-laptops-doesnt-solve-the-distraction-problem/, laptops are certainly not the only source of distraction in a classroom. However, actively engaging in non-class-related activities on a laptop is an obvious distraction and one that can be easily prevented with a simple "Hey! Put away the laptops!".

Wednesday, May 26, 2010

Granularity of Grading Scales

One thing that has always bothered me is the granularity of grading scales. The point of giving a grade should be to classify how well the student learned the material. It seems reasonable to classify this level of understanding into “not at all”, “not very much”, “ok”, “pretty well”, and “excellent”, which correspond to the typical F, D, C, B, A. What does NOT make sense is to assign a number on a scale of 0-100. A grade of 67 seems to indicate that 67 percent of the material was learned. That is, however, not at all the case. Rather, it means that the student answered 67% of the particular questions posed on this assignment correctly. It is extremely rare for faculty to ask exactly the right set of questions to determine whether every concept was learned in a reasonable way, so this number is just about meaningless.

I have always been in favor of oral exams. I find it extremely easy, within a 5-minute conversation with a student, to classify their understanding of what they should have learned into one of the five categories described above. I guess at some level you’d have to buy into my “Teach the Why, Not the How” concept (http://daviddoria.blogspot.com/2010/05/teach-why-not-how.html) to accept that it doesn’t really matter whether a student can produce the correct numerical value on an exam question, but it is EXTREMELY CRUCIAL that they understand the general “what is going on”. I understand that oral exams are not reasonable in large classes, and this would certainly need to be addressed in order to implement such a system on a large scale.

Monday, May 24, 2010

Why does no one care that professors aren't trained as instructors?

It has always seemed extremely odd and unacceptable to me that faculty members at most universities, while being experts in their areas of research, have not received even a single hour of training on how to be an effective educator. While such expert status may make someone the best person to teach a very specialized topics course on their research interests, how does it make them the most qualified person to teach introductory, or even advanced undergraduate, courses? One could argue that a student who has completed the course is equally qualified to teach it as the professor. Though the professor may know many more advanced topics, these rarely help in explaining basic principles; in fact, they may make the explanations more convoluted.

Many parents are so insistent on having these unqualified instructors that some universities have instituted policies to prevent graduate students from teaching classes. A graduate student who is interested in teaching would serve as a much better instructor than a "distinguished" faculty member who learned the material over 40 years ago, hasn't changed his teaching style to keep up with modern trends in education research and learning-style evolution, and is frankly uninterested in teaching at this point in his career. It is unfortunate that parents don't consider these facts when deciding, especially so passionately, who should be teaching their student.

For almost any other job, training is an absolutely integral part of the position. Airline pilots must log thousands of hours before they are entrusted with passengers. There are even federal regulations to ensure that every airline pilot is not only trained appropriately, but can also demonstrate that his training has made him an excellent pilot. Yet for arguably the most important job, educating the next generation of the world, no one blinks an eye at the zero hours of training logged by the pilots of the classrooms.

My recommendation is a "basic training" for faculty. When any university hires a new faculty member, they should be required to attend a several week training program by a nationally standardized group. It is the responsibility of this group to have thorough training, practice, and examination programs in place to ensure new faculty are going to be effective educators. This initial training would be a massive step forward, but it cannot end here! Almost no certifications are a "get it once and have it for life" type of deal. Every 5 years, the faculty should be required to attend a 1-2 week "re-certification". The faculty training staff would be provided with the course reviews that the faculty member has gathered in the time since the last training session, and they would work together to address any issues that may have arisen, as well as introduce new technologies and methods that can (and usually should) be incorporated into the instruction.

It is well known that there is much griping among students: "All of my classes are terrible!" and "The professor just rambles at the blackboard!". With this type of faculty training system in place, there should be no way for these complaints to continue, making students happier, smarter, and much better engineers of tomorrow.

Thursday, May 20, 2010

Teach the "Why", Not the "How"

Let us consider a college Calculus course. The timeline of the course goes approximately like this:

Week 1 - What is a derivative?
Weeks 2-10 - Practice manually computing derivatives of hundreds of functions.
Week 11 - What is an integral?
Weeks 12-20 - Learn many methods for performing integration manually and practice this on hundreds of complex functions.

They have missed the point completely. With today's technology, on a device as simple as a handheld calculator, one can find a derivative (the subject of 10 weeks of practice) in a single line:

diff(f(x),x)

The same goes for integration:

int(f(x),x)
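
(To make this concrete: here is what those two one-liners look like in SymPy, a free computer algebra system. The particular f(x) is just an arbitrary example of mine, not anything special.)

from sympy import symbols, diff, integrate, sin

x = symbols('x')
f = x**3 * sin(x)       # any f(x) you can write down

print(diff(f, x))       # the derivative, computed symbolically
print(integrate(f, x))  # the antiderivative, computed symbolically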

What is it, then, that we humans are needed for at all? The answer is threefold:

1) To produce an f(x) out of a real-life problem

2) To know when to type one of those lines into the calculator or computer

3) To understand how to interpret the results

Consider a typical, simplified workflow in an industrial application. You work at a marble factory. Your boss says "Dave, we need you to make a box out of this cardboard square so that we can ship the most marbles." You go back to your office and draw a sketch of an unfolded box. You come up with an equation for the volume of the resulting box in terms of a couple of parameters. You realize that what you need to do is find the maximum of this volume function. You know that by setting the derivative of a function to zero you will find its extrema! You then type

solve(diff(f(x),x)=0,x)

into the nearest computing device. It tells you that there are two solutions. You realize that one is probably a minimum and one is probably the maximum you are looking for. Another couple of lines to verify this:

v(solution1)
v(solution2)

You pick the solution that produces the largest volume. You are done! You report to your boss that you know how to make the biggest box out of the cardboard blank, and he is happy with your work.
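
Here is a sketch of that whole workflow in SymPy. The side length of the cardboard square (and therefore the volume function) is a number I made up for illustration, not anything from the boss:

from sympy import symbols, diff, solve

x = symbols('x', positive=True)   # size of the square cut from each corner
L = 12                            # assumed side length of the cardboard square
v = x * (L - 2*x)**2              # volume of the folded box

critical_points = solve(diff(v, x), x)            # zeros of the derivative: x = 2 and x = 6
volumes = [v.subs(x, c) for c in critical_points]
print(max(volumes))                               # 128, achieved at x = 2: the biggest box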

Here is a rough breakdown of the knowledge required to solve this problem:

50% - Setting up the volume function
40% - Knowing that the zeros of the derivative are the extrema
10% - Interpreting what the roots mean

Note that 0% was reserved for knowing HOW to solve for the roots. Computers can do this :)

Your boss may ask you "How did you do it?". To this you can say "I developed an equation for the volume of the box in terms of a couple of parameters that you can see in this sketch. I then maximized this function to determine the appropriate parameters." You will NOT have to answer the question "How did you solve that equation?". This is because not only is it unimportant, it is also assumed that you have used all of the tools at your disposal to prevent human error.

Let's revisit the Calculus class. How much time did they spend teaching you how to develop functions from practical situations? Almost none. How much is this necessary? VERY! How much time did they spend teaching you HOW to solve the problems? Almost 10 weeks. How much is this necessary? NONE!

The solution here is to shift the instruction away from the "how" and focus on the "why".

A skilled team of mathematicians and computer scientists has figured out how to handle almost any operation on almost any function you can develop. If you are not on that team, you do not need to know what happens inside the box; you only have to be intelligent about what you feed to the box, and you have to know what to do with the information the box gives you back. THIS is engineering.

Certainly we do not want these mechanics/details to become a lost art. By not teaching every student these details, we are not losing the art. There are hundreds of textbooks published with the mechanics/methods for these operations. If one ever does need to know the details (perhaps they've joined the team writing the next software package!), any of these books can be consulted.

I am ABSOLUTELY NOT condoning the typical malpractice of students just typing equations into a calculator without knowing what they mean. In fact, many teachers have banned calculators from classrooms for this reason. Unfortunately, these teachers have missed the point. If their questions can be answered by the calculator alone, they are asking the WRONG QUESTIONS! As described above, we need to focus almost entirely on what the calculator CANNOT do, and explain that it is OK to defer the parts that the calculator CAN do to the calculator! A math teacher may tell you "No, no, look at all of these real-life problems the students are doing for homework!". Look more closely. These "real-life" problems are typically just a mask over a mundane "do the computation" type of problem. Rather than just renaming variables, the emphasis needs to be on "Look how many exciting things you can do now that you know these concepts!".

We have to be careful, though, not to say "Here is the theory, now you are done." That is very bad. Theory without practice is reserved for scientists and mathematicians! But we are engineers! We must say instead "Here is the theory, now we're going to discuss and give you many examples of typical use cases". This is what will put students on the right track to becoming excellent, problem-solving engineers.

Monday, May 17, 2010

Problem Abstraction: Helping others help you & promoting the archive-ability of your answers

I find all too often on mailing lists and forums that people are so concerned with getting the answer to their question that they forget that no one else has anywhere near the same view of the problem as they do.

Asking too specific a question is extremely harmful to all parties. For the person asking, it is very likely readers will skip the question entirely because they don't want to spend time understanding an enormous software system in order to answer one small question. Assume that someone DOES take the time to figure out what is going on and answer the question. This is still an enormous waste of resources, because the only person who will benefit from the answer is the original poster. Future readers will certainly never find the post in a search.

"Abstracting the problem" is a technique which benefits everyone involved. First, the person asking the question is force to thoroughly think through the problem, often coming to the solution themselves along the way! Second, the readers immediately know what is going on as the question is phrased using language already familiar to them. Third, future readers, again using the same language, can easily find the question and answer pair. This not only is more efficient for the person performing this future search, but also prevents the same question from being asked yet again, and the whole cycle repeating itself unnecessarily.

Let's take a simple example:

If the question reads:

I am making a first person shooter game. My characters get to jump around and shoot each other. In the menu that comes up when you press escape, under where it has their score I want to be able to have the person change their players name, but it doesn't let me!

Here is my game:

[insert OpenGL c++ mess here]

Please help!!


Most readers' response will be "what is this guy talking about?", and they will skip to the next question.

If, instead, the question reads:

I have made a simple example of my problem. I have a class and I want to set one of its variables, but it is telling me "Name is private". What does this mean? Here is the simplest piece of code I could get to demonstrate the problem I am having:

#include <string>

class Player
{
  std::string Name; // class members are private by default
};

int main()
{
  Player MyPlayer;
  MyPlayer.Name = "David"; // error: 'Name' is a private member of 'Player'
  return 0;
}

It would take any user with some C++ experience an extremely short amount of time to recognize that the poster is indeed trying to set a private member directly, which is not allowed. Future users with the same problem could potentially find this post by searching for "is private" or "member variable" or any number of other very reasonable search phrases. The only expense for this massive efficiency gain is that the person asking the question has to do a bit of thinking to "abstract the problem" and provide the simplest compilable example for the community to look at.

A simple template to follow to help ensure the question is asked in a reasonable fashion (a filled-in sketch follows the list):

1) Brief description of the problem / errors
2) Current output
3) Expected output
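
As a hypothetical illustration (the snippet, its outputs, and the confusion are all invented by me), a question following this template can be as short as:

# 1) Problem: I want the whole number of times 2 fits into 5, but I get a fraction.
# 2) Current output: 2.5
# 3) Expected output: 2
print(5 / 2)   # in Python 3, / is true division; 5 // 2 gives the expected 2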

The skill of abstraction is invaluable and can only be learned through practice, but it is definitely worth the effort for the benefit of entire online communities.

Friday, May 14, 2010

Creating a Common Research Language

As a research engineer, the sharing and dissemination of ideas throughout the field is absolutely critical. These ideas are tightly coupled to their implementations. As an explanatory example, consider a very simple invention, the pencil. One can explain the concept of a pencil in a single sentence: "A pencil is a writing implement usually constructed of a narrow, solid pigment core inside a protective casing." Now, consider that you have an incremental improvement to the pencil - you want to make it one color for the first half of the pigment, and a second color for the second half! It is a very simple new concept. If you were the creator of the original pencil, this small change would be extremely fast and easy to produce - simply use two different pigments in your pencil core production process. If you are NOT the creator of the original pencil, this is now an extremely complex task - you must address questions like "How do I compact graphite?", "How do I get the core to not slip inside of the casing?", and "What kind of wood do I use for the casing?". These questions have already been addressed, and likely studied in detail, by the original creator of the pencil. That is of no use to you, though. Unless you are personal friends with this person, it is not likely that he will let you use his pencil factory to try out your new idea, so you must start from scratch.

This contrived example of an improvement to the pencil is shockingly similar to the daily situations encountered by engineering researchers. The major difference is that in software there is a simple solution! The situation is all too frequent: someone develops an algorithm and spends years perfecting its implementation. Now I want to use that algorithm as a step in my research. The path rarely strays from:

1) Look online and find that there is no publicly available implementation of the algorithm published by the author.
2) Email the author asking them to share their implementation with you.
3a) The author has agreed! Now you realize the code has been written without a single regard for future users - there are no comments and no standardized workflow, basically rendering the code useless to anyone but the original author.
3b) More commonly, the author will not respond, or will give you a "sorry, I can't share that code" type of response. Either way, you must now move forward with nothing.
4) Decide whether to take your research in a different direction, or stay and fight by implementing the algorithm yourself.
5) Spend countless hours and days fighting with the nuances that were left out of the publication of the algorithm, all of which the author has surely already addressed in their implementation. (Note that step 5 is where the majority of graduate students' time is spent - REdoing past work!)

Having experienced the above path on countless occasions, I have seen three major problems that arise when implementations are not shared:

1) Massive time expense for newcomers to a field.

Consider that you are a new PhD student. There are two possibilities. 1) No students in your lab have worked in the area you plan to work in. In this case, you must start absolutely from scratch. 2) Your research interests are shared by previous students in your lab. The good news: you get to start with implementations of several important algorithms! The bad news: each algorithm has been implemented by a different student, in a different language. If you are lucky enough that the language is common, the implementations certainly did not use the same libraries. If you are extremely lucky and they did use the same libraries, the code is likely written in a way that does not encourage reuse - either no comments or no reasonable API. You might as well be in case 1 :(

2) Clique-ish research groups.

In the very rare case that a lab has multiple students working together on parts of a unified problem, the effects of the pencil analogy are intensely amplified. After a few years of multiple people working together on a code base, incremental changes are extremely easy to pump out quickly. This leads to an even more intimidating barrier for a newcomer to the field. This, in turn, leads to a less diversified outlook on problems, as the same people are almost exclusively and continually working on certain problems.

3) "Bad" re-implementations.

When it has been determined that the way to proceed is to re-implement an existing algorithm, many things start to go wrong. Besides spending massive amounts of time that should be spent elsewhere, one must also consider the quality of the re-implementation. The original researcher spent months or years completely dedicated to this particular algorithm. You intend to simply use it as a small piece in a much larger puzzle. The implementation you create in a week will absolutely not compare to the original in speed, correctness, or reusability. This leads to inaccurate comparisons in research results, as well as overall lower quality and speed of future research.

Scientific computing/programming languages are no different from spoken languages. Spoken languages developed thousands of years ago, and each region's society settled on a particular language so that everyone could communicate with one another. This was likely all done unintentionally. Here, the people are the researchers, and the regions are the fields of research. If laypeople could come to these conclusions without even intentional consideration, why can researchers not come to the same conclusions in the face of a very serious problem?

My field is computer vision and scientific data analysis. The languages of choice are Matlab and C++. The libraries (sub-languages) of choice are CGAL, VTK, ITK, and VXL. This exposition is not intended to claim a particular solution (though many of you know which way I lean :) ). If one of these were chosen as the "accepted" language for this type of research, the field would be able to accelerate at an amazing rate.

If a researcher is able to say "I have implemented all of my research as VTK filters", the next researcher can simply use his work as a building block for the next year's research. If, instead, the researcher says "I wrote all of my classes from scratch by myself 20 years ago and my students and I have used them ever since", there is a serious problem.
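
To show what "implemented as a VTK filter" might look like in practice, here is a minimal sketch using VTK's Python wrapping. The class name and the pass-through body are placeholders of mine, not anyone's actual research code:

from vtkmodules.util.vtkAlgorithm import VTKPythonAlgorithmBase
from vtkmodules.vtkCommonDataModel import vtkPolyData

class MyResearchFilter(VTKPythonAlgorithmBase):
    # A hypothetical filter; a real one would put a paper's algorithm in RequestData.
    def __init__(self):
        VTKPythonAlgorithmBase.__init__(self, nInputPorts=1, inputType='vtkPolyData',
                                        nOutputPorts=1, outputType='vtkPolyData')

    def RequestData(self, request, inInfo, outInfo):
        inp = vtkPolyData.GetData(inInfo[0])   # the upstream filter's output
        out = vtkPolyData.GetData(outInfo)
        out.ShallowCopy(inp)                   # placeholder: pass the data through unchanged
        return 1

Because the class obeys the standard pipeline interface, anyone can connect it between a reader and a writer exactly like a built-in filter - which is precisely the kind of reuse being argued for here.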

Monday, May 10, 2010

ASEE Northeast Section Conference, May 7-8, 2010: Summary

I recently attended the Northeast Section ASEE conference in Boston, MA. While there were two full days of poster and oral presentations, a few stood out as particularly profound to me. The following is a brief summary of the conference and these excellent presentations.

Theme

The conference theme was “Education in the Digital Age”. Many of the presentations addressed this theme. Some did so directly, through the introduction of new technologies and software. Others did so indirectly, by studying the behaviors of the students of the “digital age” and adapting teaching philosophies to better suit their radically different learning styles compared to students of the 20th century.

Keynote Address – Professor Woodie Flowers, MIT
Professor Flowers had some extremely interesting insights into the current state of engineering education. He presented the results of several studies which clearly show that, in general, students are not taught in college the things they are expected to know for their jobs after graduation. One study surveyed a group of engineers who graduated 10-14 years ago. They were presented with a list of 100 topics and asked to indicate whether they learned each topic in engineering school or on the job, whether the topic is useful to them in their current role as a professional engineer, and whether their employer expects them to be knowledgeable in the topic. The responses clearly indicated that almost all of the things they learned in engineering school were not used or expected to be known, and that the things which are used and expected on the job were not taught to them in school. This seems like a very serious problem.

Professor Flowers also pointed out that society is expending significant effort in all the wrong places. He used a very interesting example: the recent film Avatar had a budget of $500M and a staff of over 1,000 technical experts. After the initial hype, its long-term contribution to society is essentially non-existent. On the other hand, a typical textbook is the result of the work of 2-5 people and an extremely small budget. The textbook, however, has an enormous impact on society, as it is used in the training of the engineers of the next half-century.

A main topic of the talk was the separation of “training” from “education.” He defines training as the mechanics of problem solving, learning terminology, etc. “Education” is the transfer of experience and insight from the instructor to the students. His idea is to “outsource the training to screens.” That is, the students should be responsible for learning the mechanics and the “dry” parts of course topics on their own. They can then spend the valuable in-class time in thoughtful discussion with the course professor. This outsourcing, of course, would need to be to more than simply a textbook. This part of the plan calls for an extremely multidisciplinary team of educators. Not only the “content experts” who currently write textbooks, but also graphic artists, linguists, education researchers, and game designers (everyone who would be on the Avatar team) should work together to develop a highly interactive and exciting electronic training system that caters to all learning styles. The idea is to start working together. There are dozens of books which try to teach the same material that has been around for centuries; it is time we work together to develop a single, comprehensive solution that is more effective at educating our young engineers.

Professor Flowers also addressed the concern of how to challenge or change a system such as the current education system, which has such enormous momentum. His thought is to develop this new process in parallel, prove that it works, and then hope that it is slowly absorbed by the mainstream system. This seems like the only reasonable approach to me.

Learning an Integrated View of Engineering (LIVE)
The author has developed a web-based tool to formally connect topics throughout the ECE department at her university. The idea is that students enter “Course 1,” then leave “Course 1” and enter “Course 2.” If the instructors are excellent, they will make explicit the links between topics across course boundaries. If they are not, these links are not necessarily found by students on their own, and missing these links is a major loss. The tool is simply an interactive concept map that students can navigate. The leaves of the concept tree are basic concepts (algorithm, signal, etc.). These leaves feed up to slightly more complicated concepts, which eventually feed up into applications. Students can ask questions like “What can I do once I learn this?” and find some exciting answers, which will hopefully keep them motivated. Links between nodes at the same level can help students find parallels between concepts that aren’t necessarily sequential. These links can help them look at topics from different angles, hopefully enhancing overall understanding.
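
As a sketch of the data structure such a tool might sit on (the concepts and links below are stand-ins I invented, not the author's actual map):

# A concept map as a directed graph: each concept points to the
# more advanced concepts that it feeds into.
feeds_into = {
    'signal': ['sampling', 'convolution'],
    'algorithm': ['convolution'],
    'sampling': ['digital audio'],
    'convolution': ['image filtering', 'digital audio'],
}

def what_can_i_do(concept):
    # Answer "What can I do once I learn this?" by collecting every reachable concept.
    reachable, frontier = set(), [concept]
    while frontier:
        for nxt in feeds_into.get(frontier.pop(), []):
            if nxt not in reachable:
                reachable.add(nxt)
                frontier.append(nxt)
    return reachable

print(what_can_i_do('signal'))  # {'sampling', 'convolution', 'digital audio', 'image filtering'}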

The tool also overlays the institution's curricula onto the concept map. For example, one can go to the section of the map that covers the topics from ECE 101. I spoke to the author about sharing this tool. Since the concept map is essentially static, any institution could very easily overlay its own course numbers and concept organization onto the existing map. She is checking with her funding sources to see if sharing the system is possible, but if it is, I highly recommend that RPI look into adopting it.

Inverting the Lecture Paradigm
This was a recurring theme throughout several presentations. It seems to be now widely accepted that lecturing is no longer an effective way to communicate with today’s students. Hundreds of years ago, lectures developed because one person was the only source of the knowledge, and this one-to-many communication was a very good way to spread information. Today, however, students have the same information at their fingertips at home and literally everywhere they go (laptops, iPods, iPads, Kindles, etc.) in the form of textbooks (printed and digital), recorded lectures, and other online tutorials and resources. We must recognize that it is not hard at all for students to obtain the information. What we must do now is teach them what to do with it and when to use it.

The current system is set up for students to hear the material in class in the form of a lecture, then go home and practice it in the form of homework problems. The proposal is to do exactly the reverse. Students should learn concepts and do problems before coming to class, then spend class time in discussion with the professor about when and where to use these new tools. You may remember an almost identical concept from the summary of the keynote address, where the “training” was to be separated from the “education”. That this concept has arisen in several places independently is quite telling.

Perception is Important

Too often, students do not give themselves enough credit. They will get to the end of a course and feel like they may not have learned very much. A study used an end-of-course survey to collect data about how much students thought they had learned. When goals were not clearly defined at the beginning (as, more often than not, they are not), students indicated that they thought they had not learned very much. In the same course taught by the same instructor, this time with clearly defined goals given at the beginning and throughout the course, students’ responses about how much they perceived they had learned were markedly higher. Both sets of students actually learned the same material! The point is that with some very trivial tweaks, such as taking 10 seconds to define a goal, students “feel better” about what they have done, are less likely to feel they are wasting their time, and attrition rates should decrease.