User talk:Thepigdog


Welcome to this club! Here your originality should feel much better than on WP. --Boris Tsirelson 10:14, 13 February 2011 (EST)

Welcome! :-) —Tom Larsen (talk) 18:52, 13 February 2011 (EST)
And please sign your message(s) on Talk:Inductive Inference (and everywhere) by ~~~~. --Boris Tsirelson 01:56, 4 March 2011 (EST)


Visual Data Array

This article is empty. Do you intend to work on it, or shall I delete it for you? --Paul Wormer 11:29, 13 June 2011 (EDT)

Yes, please delete that page. Thanks. Thepigdog 07:54, 14 June 2011 (EDT)

Wiki categories

What do you think of the use of categories, as in (for example),

Thepigdog 05:36, 9 September 2011 (EDT)

For now we have a rather ill-organized set of categories. About the new category "Symbolic Logic:Axiom Theorem Systems": I think you'd better first create the category "Symbolic Logic" and then, if needed, its subcategories. --Boris Tsirelson 07:41, 9 September 2011 (EDT)

Tree Rules EXtractor

"In 2001, Maieutica was replaced by T. Rex (Tree Rules EXtractor), a software generator containing the needed interfaces to allow a not computer scientist to develop himself a program (as an expert systems), to integrate it in applications and to manage database by reasoning, all without programming language." Seeing this phrase here I wonder, maybe you have some interest and/or opinion about it? --Boris Tsirelson 15:11, 9 November 2011 (EST)

Thanks for that. I will have a look. :) Thepigdog 19:07, 11 November 2011 (EST)

Unfortunately I do not speak French. I ran iajpl.pdf through Google Translate. There is not a lot of explanation of the underlying algorithm. It seems to be a learning system used in fault detection.

The five laws of Logic Flow:

Maybe something was lost in translation here. Thepigdog 01:57, 12 November 2011 (EST)

I see... and I do not read French either. --Boris Tsirelson 11:01, 13 November 2011 (EST)
The above seems approximately like a neural-network feedback mechanism, but applied to logic. Thepigdog 00:13, 26 November 2011 (EST)

Learning in Inheritance Hierarchies

Just brainstorming here.

If you have a set of tuples of attributes, each tuple representing an object or thing, then you could classify the things into an inheritance tree.

Let an inheritance tree be a multiway tree made up of TreeNodes, where each TreeNode holds a Tuple of attributes and a list of child TreeNodes.

ClassifyTuple then inserts one observed tuple into the tree by calling TreeNode.AddTuple(Tuple newTuple) on the root, which places the tuple at the node it fits best; a rough sketch of one way to do this is given below.
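A minimal Java-style sketch of what I mean. The class names, the attribute-overlap score used as a similarity measure, and the stopping rule (attach the tuple as a new child when no child matches better than the current node) are placeholders, not a settled design:

    // Tuples are just lists of attribute values; null means "value unknown".
    import java.util.ArrayList;
    import java.util.List;

    class Tuple {
        List<Object> attributes = new ArrayList<>();
    }

    class TreeNode {
        Tuple tuple;                                  // the tuple left at this node
        List<TreeNode> children = new ArrayList<>();

        TreeNode(Tuple t) { tuple = t; }

        // Placeholder similarity measure: count attribute positions that agree.
        int matches(Tuple other) {
            int score = 0;
            int n = Math.min(tuple.attributes.size(), other.attributes.size());
            for (int i = 0; i < n; i++) {
                Object a = tuple.attributes.get(i);
                Object b = other.attributes.get(i);
                if (a != null && a.equals(b)) score++;
            }
            return score;
        }

        // AddTuple: recurse into the best-matching child, or attach the tuple
        // here as a new child when no child matches better than this node.
        void addTuple(Tuple newTuple) {
            TreeNode best = null;
            int bestScore = matches(newTuple);
            for (TreeNode child : children) {
                int s = child.matches(newTuple);
                if (s > bestScore) { bestScore = s; best = child; }
            }
            if (best == null) children.add(new TreeNode(newTuple));
            else best.addTuple(newTuple);
        }
    }

    class InheritanceTree {
        TreeNode root = new TreeNode(new Tuple());    // empty root matches nothing

        // ClassifyTuple: add one observed tuple to the tree.
        void classifyTuple(Tuple t) { root.addTuple(t); }
    }

With a rule like this, similar tuples end up along the same branch, which gives the inheritance-like grouping.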

I think this algorithm may be improved further. It leaves Tuples at the nodes when it should leave idealised averaged tuples at the nodes and have actual Tuples at the leaves. This would lead to more complexity in the algorithm and the need for some kind of balancing.

Each attribute of the tuple could record a null value, or an average and standard deviation (for a quantity) or a set of values (for discrete values).
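As an example, the idealised tuple at an internal node could keep one summary object per attribute position, something like the following (the running-sum approach to the mean and standard deviation is just one choice):

    import java.util.HashSet;
    import java.util.Set;

    // Summary of one attribute position in the idealised tuple kept at a node.
    class AttributeSummary {
        int count = 0;                        // non-null observations seen so far
        double sum = 0.0, sumSq = 0.0;        // running totals for numeric values
        Set<Object> values = new HashSet<>(); // observed values for discrete attributes

        void observe(Object value) {
            if (value == null) return;        // null: attribute unknown in this tuple
            count++;
            if (value instanceof Number) {
                double x = ((Number) value).doubleValue();
                sum += x;
                sumSq += x * x;
            } else {
                values.add(value);
            }
        }

        double mean() { return count == 0 ? 0.0 : sum / count; }

        double stdDev() {
            if (count == 0) return 0.0;
            double m = mean();
            return Math.sqrt(Math.max(0.0, sumSq / count - m * m));
        }
    }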

The inheritance tree could be used to predict missing information from a partially populated test tuple.
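To predict, a partially populated test tuple could be matched down the tree and its null attributes filled from the best-matching node. A sketch, reusing the TreeNode and matches() placeholders from the earlier fragment:

    class Predictor {
        // Descend while some child matches the partial tuple better than the
        // current node, then copy that node's values into the null slots.
        Tuple predictMissing(TreeNode root, Tuple partial) {
            TreeNode current = root;
            boolean improved = true;
            while (improved) {
                improved = false;
                TreeNode best = null;
                int bestScore = current.matches(partial);
                for (TreeNode child : current.children) {
                    int s = child.matches(partial);
                    if (s > bestScore) { bestScore = s; best = child; }
                }
                if (best != null) { current = best; improved = true; }
            }
            int n = Math.min(partial.attributes.size(), current.tuple.attributes.size());
            for (int i = 0; i < n; i++) {
                if (partial.attributes.get(i) == null) {
                    partial.attributes.set(i, current.tuple.attributes.get(i));
                }
            }
            return partial;
        }
    }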

Time to google this and see what other people have done.

Thepigdog 00:13, 26 November 2011 (EST)

This is Hierarchical Clustering. The original algorithms do not do a probability analysis. Bayesian Hierarchical Clustering provides a probabilistic model, and so may be equivalent to what I am attempting to describe.

The principle of minimizing complexity gives the most probable clustering.
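One way to read that, in minimum description length terms (my interpretation, with code lengths taken as negative log probabilities):

\[
\arg\min_H \bigl[ L(H) + L(D \mid H) \bigr]
= \arg\min_H \bigl[ -\log_2 P(H) - \log_2 P(D \mid H) \bigr]
= \arg\max_H P(H \mid D)
\]

where $H$ is a candidate hierarchy, $D$ is the set of tuples, $L(H)$ is the description length (complexity) of the hierarchy, and $L(D \mid H)$ is the description length of the tuples given the hierarchy. So, under that reading, the clustering with the shortest total description is exactly the maximum a posteriori clustering.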

Thepigdog 01:59, 26 November 2011 (EST)
