Comments on: 10 Things Genealogy Software Should Do
http://www.thinkgenealogy.com/2008/04/06/10-things-genealogy-software-should-do/
genealogy, software, ideas, and innovation
Feed last built: Sat, 21 Jul 2012 21:16:56 +0000

By: Mark Tucker
http://www.thinkgenealogy.com/2008/04/06/10-things-genealogy-software-should-do/comment-page-1/#comment-755
Sun, 18 Jan 2009 06:46:56 +0000
Brightcove stopped supporting personal accounts. Here is a link to the same presentation on Roots Television: http://link.brightcove.com/services/link/bcpid463882993/bclid958499738/bctid1486890944
By: Tom Fort
http://www.thinkgenealogy.com/2008/04/06/10-things-genealogy-software-should-do/comment-page-1/#comment-754
Sun, 18 Jan 2009 00:29:36 +0000
The Brightcove site is down. Is there an alternative way to get the video?
By: Rick Dillman
http://www.thinkgenealogy.com/2008/04/06/10-things-genealogy-software-should-do/comment-page-1/#comment-394
Mon, 05 May 2008 02:00:54 +0000
You make good points... however. The maroons who write software such as FTM (my current software) and Master Genealogist (trial version) all need to give their product much more thought. Wonder if it ever occurred to them to solicit advice from people who use the end product? Probably not, because that would imply that someone else might have better ideas than the so-called expert engineers. Another MAJOR issue is the total lack of technical support. To FTM they are only in business to sell a product; once you buy it, good luck and GOOD-BYE. People who actually live and breathe genealogy should write software.
By: Fiona Ledbetter
http://www.thinkgenealogy.com/2008/04/06/10-things-genealogy-software-should-do/comment-page-1/#comment-388
Tue, 22 Apr 2008 13:45:21 +0000
Are you kidding me? Clooz as an example? It completely messed up my Legacy installation! The makers of Clooz need to get a clue...
By: John Richardson
http://www.thinkgenealogy.com/2008/04/06/10-things-genealogy-software-should-do/comment-page-1/#comment-383
Sat, 19 Apr 2008 14:45:25 +0000
The study of source provenance implies that some sources are more reliable and carry more authority than others, which is undeniable. It is one criterion that may be used in favoring one source over a conflicting one. I have some regard for source provenance in this sense, and suspect that some groups may need to fall back on it when forced to make binding decisions based on ambiguous or incomplete genealogical evidence. There is some obvious common sense to it, though being a slave to it seems clearly wrong.

There are examples where town clerks clearly made mistakes recording births, such as attributing a child to a now-dead first wife. Many such records could be based on hearsay to start with. Then, too, there are confusing terms whose usage has changed slightly over time, such as cousin and nephew, that make such documents ambiguous, and of course the ever-popular phonetic spelling that was so common in colonial America. On top of this, handwriting styles have changed and documents get stained or torn. All this means that even original documents must be interpreted with circumspection and criticality, which means evidence must be evaluated on a case-by-case basis, not by some formulaic ranking of sources.

I am sure that paid genealogists get reimbursed for their expenses, and their deliverable will gain authority by citing sources higher up on the provenance scale. For the vast majority of people using genealogy software (non-professionals), is it worthwhile to get a copy of an original document if somebody gives you a transcript over the Internet? If it is no trouble, why not? If you doubt its veracity because of other evidence, sure! But each generation triples the number of people you are investigating in your family tree, and the effort is probably better spent on somebody else if none of those cases apply.

Personally, in prioritizing my research, I would rather find additional, independent evidence confirming my existing evidence than move my existing evidence up the scale of provenance a step or two. Hence my use of the phrase preponderance of evidence. Perhaps I am out of step with others in this, but it seems that the most common errors involve applying evidence to the wrong person, rather than getting the data wrong, and it is just as easy to apply an original document to the wrong person as it is a copy.
By: Elizabeth Shown Mills
http://www.thinkgenealogy.com/2008/04/06/10-things-genealogy-software-should-do/comment-page-1/#comment-379
Thu, 17 Apr 2008 03:01:25 +0000
John, _Evidence Explained_ does not "invent a new standard." It exists to provide what the standard guides (CMOS, APA, MLA, AP, Turabian, etc.) do not provide. Using MLA, for example, how would you cite a gravestone? A tax roll? A local court case? A census record? A military pension file? A church baptismal certificate? A family artifact?

All the standard guides do a fine job of citing published sources, that being the principal type of material used by those college students who, as you note, are taught to use MLA or CMOS (the latter being preferred over MLA in many academic fields such as my own, history). MLA and CMOS also provide an example or two for citing original documents of the type most academics use, those in university archives, but those models do not fit most resources used by genealogists or academics who mine local records.

Commendably, a significant number of academic historians, historical demographers, and practitioners of related fields *are* now using the grassroots-level original documents that have long been considered the "domain" of genealogists and amateur historians. These academic researchers, too, are discovering a need for guidance in the use and citation of those records. That is why, at the Amazon.com website for _Evidence Explained_, one sees endorsements of EE volunteered by two major historians. That is why academic reviewers for _Choice_, _Library Journal_, and _Booklist_ recommend EE for all academic libraries and upper-level/grad-level students. And that is why Library Journal just awarded EE its "Best Reference Work 2007" designation.

I do disagree with you as to the need and value of consulting original records, even when no conflict is known to exist. After all, if everyone working on a problem keeps using the same wrong abstract or database, everyone will "agree" but they'll all be wrong.

(I won't catalog, here, all the other reasons why it is important to consult those originals. I've done that elsewhere and all over.)

I am, however, puzzled as to how "source provenance is often overridden by a preponderance of evidence." Provenance, meaning "origins," speaks to the authenticity of a *single record.* "Preponderance of the evidence," which is no longer used in genealogy because it ill fits our field, is (like the GPS) a conclusion based upon a *whole body of evidence.* Would you help us see your reasoning for this statement?

Elizabeth Shown Mills, CG, CGL, FASG
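[Editor's note] The citation question the thread keeps circling, how software should handle sources that MLA and CMOS never modeled, can be made concrete with a small sketch: store the citation as structured fields and render prose on demand. The field names and the sample record below are hypothetical illustrations for discussion, not EE's actual citation templates or any product's data model.

```python
from dataclasses import dataclass

# Hypothetical sketch: a grassroots-level source stored as structured
# fields that software can both render as text and parse back apart.
@dataclass
class SourceCitation:
    source_type: str   # e.g. "census", "gravestone", "tax roll"
    jurisdiction: str  # where the record was created
    record: str        # what the record is
    locator: str       # page/sheet/plot that pins down the entry
    repository: str    # where a researcher can find it today

    def render(self) -> str:
        # Join the fields into one human-readable citation line.
        return (f"{self.jurisdiction}, {self.record}, {self.locator}; "
                f"{self.repository}.")

cite = SourceCitation(
    source_type="census",
    jurisdiction="Maricopa County, Arizona",
    record="1920 U.S. census, population schedule",
    locator="ED 108, sheet 4B, dwelling 72",
    repository="NARA microfilm T625",
)
print(cite.render())
```

Because the fields stay separate, the same record could be rendered in more than one output style, which is one way software might follow a guide like EE without hard-coding its prose.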
By: Angela McGhie
http://www.thinkgenealogy.com/2008/04/06/10-things-genealogy-software-should-do/comment-page-1/#comment-377
Wed, 16 Apr 2008 18:30:00 +0000
Mark, you have some great ideas! I hope the software developers are listening. I am waiting for my copy of Legacy 7 and the source models from EE.
Angela
By: Mark Tucker
http://www.thinkgenealogy.com/2008/04/06/10-things-genealogy-software-should-do/comment-page-1/#comment-372
Fri, 11 Apr 2008 22:57:04 +0000
John, thanks for the comments. Now we are starting to get discussion on the topic.
- Mark
By: John Richardson
http://www.thinkgenealogy.com/2008/04/06/10-things-genealogy-software-should-do/comment-page-1/#comment-370
Fri, 11 Apr 2008 17:20:29 +0000
There seems to be too much stress on formality in your article, and an excess of technological answers but a paucity of requirements analysis.

About learning, I can only speak from personal experience, and I didn't learn from my software. I learned from encountering problems: for example, learning about new-style/old-style dates the first time I found an infant recorded as dying before it was born, or learning about keeping sources when I couldn't remember where I got a date that now seems so obviously wrong. Now I understand the payoff for the work involved and I do it gladly. The software could do all those things the day I took it out of the box, but I didn't even know enough to look for those features. So I think your fundamental thesis is a little flawed.

I haven't read the Elizabeth Shown Mills book, so I obviously missed its election as a Bible, but I do know that schools teach MLA, not Elizabeth Shown Mills. A little explanation of why it is necessary to invent a new standard would be useful. I would volunteer that MLA doesn't strike me as a very machine-parseable format, but I don't recall that being mentioned as a criterion in your article.

To castigate software for its handling of sources is not fair. First, I cannot tell what you consider a good citation beyond that it adheres to Elizabeth Shown Mills, so some requirements would be nice. My feeling is that the bottom line is that a citation should ensure that another person, or even I myself, can find the source for a fact at a later date and verify it. Most software I have seen does collect source information adequately for that purpose, and yet sources are still not documented even to this extent. Or sources are input religiously for every data item and all of them merely point to ancestry.com. (Which one of the thousands of contributed family trees do you think that person stumbled across first?) While a software company is certainly going to provide source management tools in order to remain competitive, they probably don't feel like enforcing the proper use of those tools if it means risking the loss of some percentage of potential customers who don't want to be bothered.

Rather than quibbling about citation formats, it would be far more productive to ask that we get more sources online. Much of the nation is now far removed from the location of the original documents, since families have migrated across the seas or across the country. For example, wouldn't it be nice if local governments published on the Internet the vital records and probate records they hold that are over 100 years old? If this were standardized enough, one could imagine software automating the searching of these repositories and the ranking of the resulting matches, which would be great. Then yes, suck in the data automatically along with a computer-readable citation, presumably in XML as you suggest, but that is such a small part of this particular challenge and very far down the path.

Your layering idea is a nice way of keeping history. My personal preference would be to have the software never delete anything: just overlay old facts with a new version of a fact and keep the old one, with its documentation, as history, so you fully document the thought process that got you to the current state of your data, as the addition of more evidence may change your "conclusion". Unlike what is suggested by your GPS (which seems more like a process than a standard of proof), in real life there is no final conclusion to the search. "I have never seen / A finished genealogy".

Regarding some of your comments about merging and layering GEDCOMs, you might find some of the discussions on werelate.org about merging and uploading GEDCOMs useful. It is much closer to an actual requirements analysis, and I think it has a broader view, in that people will not want to suck everything onto their local system so much as use remote sources as a virtual part of their local database. This does merge nicely with your layering idea, but there are difficult issues in matching two or more arbitrary family trees when one or both may have errors, different spellings, missing facts, etc., and, once you do figure out a match, in saving the reference information from the external database so you can automatically bump the two databases against each other again in the future to quickly spot changes.

Speaking of werelate.org, the biggest impetus toward better genealogy will be the need to collaborate. The payoff will be higher-quality data for you; the cost will be the need to conform to a certain standard. But the number of websites that truly provide for collaboration is very small. I think werelate.org could get there. However, most websites just blindly accept submitted trees and keep them all in sterile isolation, so the website doesn't annoy users by enforcing standards or suggesting somebody's data is wrong. Once such a truly collaborative website achieves some general acceptance, software packages will modify their workings accordingly.

Source provenance is sort of a snooty issue, and I am not sure it is even all that important. I can't imagine the computer ever doing a good job of supplanting the user as the final arbiter. I am not George E. Bowman jealously guarding the designation of Mayflower Descendant, and even then source provenance is often overridden by a preponderance of evidence. If somebody provides a good-faith transcription or abstract, and tells me where it comes from so I can verify it if I find contradictory evidence, I will have nearly as much confidence as if I had a copy of the original. In a collaborative environment this would be even more true, as there is a very good chance somebody will have the opportunity, and take the time, to confirm the transcription/abstract.
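[Editor's note] The "never delete anything" layering preference discussed in the comment above can be sketched in a few lines: each fact keeps a chain of versions, and a revision overlays the old value rather than replacing it, so the reasoning history survives. All names below are hypothetical illustrations, not any shipping product's data model.

```python
from dataclasses import dataclass
from typing import Iterator, Optional, Tuple

@dataclass
class FactVersion:
    value: str                         # the recorded conclusion
    source: str                        # where this value came from
    previous: Optional["FactVersion"]  # the value it overlaid, if any

class Fact:
    """A fact whose revisions overlay, never delete, earlier versions."""

    def __init__(self, value: str, source: str):
        self.current = FactVersion(value, source, None)

    def revise(self, value: str, source: str) -> None:
        # Overlay a new version; the old one stays reachable as history.
        self.current = FactVersion(value, source, self.current)

    def history(self) -> Iterator[Tuple[str, str]]:
        # Walk the chain newest-first, yielding (value, source) pairs.
        v: Optional[FactVersion] = self.current
        while v is not None:
            yield (v.value, v.source)
            v = v.previous

birth = Fact("about 1820", "family Bible, transcript")
birth.revise("12 Mar 1819", "town clerk's birth register")
for value, source in birth.history():
    print(f"{value}  [{source}]")
```

Walking the chain newest-first shows the current conclusion on top with every superseded value and its documentation beneath it, so new evidence can change the conclusion without losing how it was reached.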
By: First Video Featured on Roots Television | ThinkGenealogy http://www.thinkgenealogy.com/2008/04/06/10-things-genealogy-software-should-do/comment-page-1/#comment-367 First Video Featured on Roots Television | ThinkGenealogy Tue, 08 Apr 2008 13:49:10 +0000 http://www.thinkgenealogy.com/2008/04/06/10-things-genealogy-software-should-do/#comment-367 [...] A few days ago I created and posted to the internet my first genealogy video entitled: 10 Things Genealogy Software Should Do [...] [...] A few days ago I created and posted to the internet my first genealogy video entitled: 10 Things Genealogy Software Should Do [...]

By: Happy Dae | Tue, 08 Apr 2008 00:21:18 +0000
http://www.thinkgenealogy.com/2008/04/06/10-things-genealogy-software-should-do/comment-page-1/#comment-366

Good Lowered! How can I get hold of one of THOSE?! Niche product or not, I want it!

Happy Dae.
http://www.ShoeStringGenealogy.com

By: Mark Tucker | Mon, 07 Apr 2008 23:36:27 +0000
http://www.thinkgenealogy.com/2008/04/06/10-things-genealogy-software-should-do/comment-page-1/#comment-365

I used a trial version of Camtasia Studio to make the video. Very easy to use, but on the pricey side. Still trying to decide if I will buy it. Let’s see if I find the time to make some more videos before the trial expires ;-)

By: Bob Coret | Mon, 07 Apr 2008 22:12:39 +0000
http://www.thinkgenealogy.com/2008/04/06/10-things-genealogy-software-should-do/comment-page-1/#comment-364

Mark, nice video! Which tool did you use to create it?

By: Greg Matthews | Mon, 07 Apr 2008 19:27:42 +0000
http://www.thinkgenealogy.com/2008/04/06/10-things-genealogy-software-should-do/comment-page-1/#comment-363

This is one of the best genealogy blog entries I’ve seen. Ever. I have wished for just about every point you make, but seriously doubt we’ll ever see it come to fruition.

I believe that for something like this to come along, it will have to be from some company that does not currently publish genealogy software. While I dearly love my Legacy 6.0 and eagerly await 7.0, they are only touching the tip of the iceberg, from what I hear, of being next-gen genealogy software. Every publisher out there, from Millenium to the Generations Network to FamilySearch, has its hand in its own little cookie jar, fist clenched around a pile of cookies, with no way to pull free without letting go. It seems like every time a major version of genealogy software is released, there are one, maybe two, truly great things to separate it from the competition. No one wants to go all the way and give us something truly useful and unique, because it is a) too far beyond the scope of what they already do, and b) the majority of “genealogists” are nothing more than name collectors who would have neither use for nor interest in “advanced” software capabilities.

If someone does see possibilities in utilizing even some of the ideas you posit, it will have to come from a different source than we currently see on the market. Even then, I’m afraid it will wind up being a niche product marketed to professional genealogists with a steep price tag.
