+priv


Friday, April 30, 2004 RE: Battelle on Google’s S-1: http://radio.weblogs.com/0114065/categories/myProfession/2004/04/30.html#a481

liquid labour day


this is where the action is:

1. paths: /data/www/virtual/netfrag/sites/public/rssreader, /home/service/bin

2. urls: http://www.netfrag.org/rss/2.0/nfo.yakka.rdf, http://www.netfrag.org/rss/1.0/nfo.yakka.rdf


doesn't work (no weekday prefix): 6 Mar 2003 22:57:40 GMT, 13 Feb 2003 09:04:19 GMT

works (full RFC-822 style date with weekday): Tue, 11 Mar 2003 01:01:02 +0100
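
The failing pubDates are the ones without the weekday prefix, so the safest bet is to always emit the full RFC-822 style. A minimal Perl sketch of that (a hypothetical helper, not taken from any of the scripts mentioned here; assumes the C/POSIX locale so the weekday and month names come out in English):

use strict;
use warnings;
use POSIX qw(strftime);

# Emit an RFC-822 style date *with* the weekday prefix, i.e. the variant
# noted above as working.
my $pub_date = strftime('%a, %d %b %Y %H:%M:%S GMT', gmtime(time));
print "$pub_date\n";    # e.g. "Tue, 11 Mar 2003 01:01:02 GMT"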

example pages/links for RssReader?

error (no patchsets seem to be generated) with /usr/local/bin/xcvs -c /etc/xcvs.conf ... ahhh, the config file is actually /etc/xcvs.config

rss 1.0: "rss-merge.xsl" requires every <item> to carry an (rdf:)about attribute!!! it seems to get used as some kind of id/key:

[...]

[...]

[...]

=> editing rss2rdf.xsl accordingly: since we don't have a URL, we use the title (as the about value).
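
The actual fix lives in rss2rdf.xsl; just to illustrate the idea outside XSLT, a hedged Perl sketch (hypothetical, using XML::LibXML) that post-processes an RSS 1.0 feed and gives every <item> an rdf:about, falling back to the title when there is no link:

#!/usr/bin/perl
use strict;
use warnings;
use XML::LibXML;

my $rdf_ns = 'http://www.w3.org/1999/02/22-rdf-syntax-ns#';
my $rss_ns = 'http://purl.org/rss/1.0/';

my $doc = XML::LibXML->load_xml(location => shift @ARGV);
my $xpc = XML::LibXML::XPathContext->new($doc);
$xpc->registerNs(rdf => $rdf_ns);
$xpc->registerNs(rss => $rss_ns);

for my $item ($xpc->findnodes('//rss:item')) {
    next if $item->hasAttributeNS($rdf_ns, 'about');
    my ($link)  = $xpc->findnodes('rss:link/text()',  $item);
    my ($title) = $xpc->findnodes('rss:title/text()', $item);
    # no url available -> use the title as the id/key, as described above
    my $about = $link ? $link->data : ($title ? $title->data : '');
    $item->setAttributeNS($rdf_ns, 'rdf:about', $about);
}

print $doc->toString(1);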


Issue with CVS + RSS:

propagate the URL for CVSView from somewhere (at the beginning) via news into the RSS "resource" attribute, to have a proper URL for:


Q: Better use cvs2rss??? Check it out!


little infrastructure to have list of "newsgroups of interest" (for clients) at /etc/news/

build email-obfuscators into web-scripts (e.g. newsportal: "From: " header)
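
A hedged sketch of the obfuscation itself (newsportal is PHP, so the real change would live in its article rendering; the address below is just an example):

use strict;
use warnings;

# Rewrite an address so harvesters can't pick it up verbatim.
sub obfuscate_address {
    my ($addr) = @_;
    $addr =~ s/\@/ (at) /g;
    $addr =~ s/\./ (dot) /g;
    return $addr;
}

print obfuscate_address('user@example.org'), "\n";   # "user (at) example (dot) org"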

integrate "left" and "right" (or previous/next?) links into newsportal's "article.php"

flow: cvs commit -> cvsspam -> Mail::Dispatch::Gateway::News -> innd

add long versions of commands to "shortcuts::files" (e.g. f2s -> file2string); write modules shortcuts::files::php|perl|basic|pascal which mimic the file handling semantics of these languages/platforms
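
A hedged sketch of what shortcuts::files could look like; only f2s/file2string come from the note above, string2file/s2f are made-up counterparts for illustration:

package shortcuts::files;
use strict;
use warnings;
use Exporter 'import';
our @EXPORT_OK = qw(file2string f2s string2file s2f);

sub file2string {
    my ($path) = @_;
    open my $fh, '<', $path or die "file2string: cannot open $path: $!";
    local $/;                 # slurp mode
    return scalar <$fh>;
}

sub string2file {
    my ($path, $data) = @_;
    open my $fh, '>', $path or die "string2file: cannot open $path: $!";
    print {$fh} $data;
    close $fh or die "string2file: cannot close $path: $!";
    return 1;
}

# short aliases for the long, descriptive names
*f2s = \&file2string;
*s2f = \&string2file;

1;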

describe .symlinkrc

news-items which were converted to rss-items: where do we get additional (esp. "link") metadata from? could a parser for nfo.links.* be feasible? yup - should be: parse it from e.g. "Content-Location:" or "Content-Base:" if it is a MIME message.

no! don't use them: unfortunately they appear to have random blanks in them (where from? how does that happen?), e.g.:

Content-Base: "http://support.lis.uiuc.edu/documentat ion/systems/RAID--Initial_Partition ing.html"
Content-Location: "http://support.lis.uiuc.edu/documentat ion/systems/RAID--Initial_Partition ing.html"

maybe use text from the first part of the mime-message: http://support.lis.uiuc.edu/documentation/systems/RAID--Initial_Partitioning.html

or we have to assume "http://" (or parse the protocol from Content-Base/Location) and append the filename from "Content-Disposition", e.g.:

Content-Disposition: inline; filename="support.lis.uiuc.edu/documentation/systems/RAID--Initial_Partitioning.html"

... which seems to come through contiguous (no stray blanks this time).
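
A small Perl sketch of that Content-Disposition approach (a hypothetical helper; the http:// protocol is assumed here rather than parsed from Content-Base/Location):

use strict;
use warnings;

sub link_from_disposition {
    my ($disposition) = @_;   # raw Content-Disposition header value
    return unless $disposition =~ /filename="([^"]+)"/;
    my $path = $1;
    $path =~ s/\s+//g;        # defensively drop any stray blanks anyway
    return "http://$path";    # assumption: http://, not parsed from headers
}

my $header = 'inline; filename="support.lis.uiuc.edu/documentation/systems/RAID--Initial_Partitioning.html"';
print link_from_disposition($header), "\n";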


ftp://ftp.suse.com/pub/suse/i386/9.1/boot/boot.iso


/usr/bin/perl /usr/lib/cgi-bin/cvsmonitor/cvsexec.pl -v -l checkout netfrag.nfo



lamb - heaven - six feet under soundtrack

daemon - turns other processes into daemons
fastjar - Jar creation utility
java-common - Base of all Java packages
java2-common - Common facilities for all Java2 environments
jython - Python seamlessly integrated with Java
jython-doc - Jython documentation including API docs
sablevm - Free implementation of Java Virtual Machine (JVM) second edition
kaffe - A JVM to run Java bytecode
kaffe-common - Files shared between all Kaffe VM versions


/data/www/virtual/netfrag/sites/public/rss/

full list:

first test:


rss2rdf

It turns out that the RSS feeds news2rss generates are RSS 2.0 ones, while TWiki produces RSS 1.0 (RDF). The rss-merge XSLT 1.1 stylesheet from http://ex-code.com/rss-merge/ also operates on RSS 1.0. Solution? Find something to convert RSS 2.0 to RSS 1.0, or take a completely different approach to RSS feed merging.
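
A hedged sketch of the first option: convert an RSS 2.0 feed into a minimal RSS 1.0 (RDF) document with XML::LibXML. This only illustrates the mapping (title/link/description, nothing else) and is not what the rss2rdf script mentioned below actually does:

#!/usr/bin/perl
use strict;
use warnings;
use XML::LibXML;

my $RDF = 'http://www.w3.org/1999/02/22-rdf-syntax-ns#';
my $RSS = 'http://purl.org/rss/1.0/';

sub text_el {
    # helper: append <name>text</name> in the given namespace under $parent
    my ($parent, $ns, $name, $text) = @_;
    my $el = $parent->ownerDocument->createElementNS($ns, $name);
    $el->appendText(defined $text ? $text : '');
    $parent->appendChild($el);
}

my $in = XML::LibXML->load_xml(location => shift @ARGV);
my ($chan_in) = $in->findnodes('/rss/channel')
    or die "input does not look like an RSS 2.0 feed\n";

my $out = XML::LibXML::Document->new('1.0', 'UTF-8');
my $rdf = $out->createElementNS($RDF, 'rdf:RDF');
$out->setDocumentElement($rdf);

# channel metadata, identified by its link
my $chan = $out->createElementNS($RSS, 'channel');
$chan->setAttributeNS($RDF, 'rdf:about', $chan_in->findvalue('link'));
text_el($chan, $RSS, $_, $chan_in->findvalue($_)) for qw(title link description);
$rdf->appendChild($chan);

# RSS 1.0 lists the items twice: as rdf:li references inside the channel
# and as <item> siblings of the channel
my $items = $out->createElementNS($RSS, 'items');
my $seq   = $out->createElementNS($RDF, 'rdf:Seq');
$items->appendChild($seq);
$chan->appendChild($items);

for my $item_in ($chan_in->findnodes('item')) {
    my $about = $item_in->findvalue('link') || $item_in->findvalue('title');

    my $li = $out->createElementNS($RDF, 'rdf:li');
    $li->setAttributeNS($RDF, 'rdf:resource', $about);
    $seq->appendChild($li);

    my $item = $out->createElementNS($RSS, 'item');
    $item->setAttributeNS($RDF, 'rdf:about', $about);
    text_el($item, $RSS, $_, $item_in->findvalue($_)) for qw(title link description);
    $rdf->appendChild($item);
}

print $out->toString(1);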


http://blogspace.com/rss/tools: mod_index_rss (author: Brian Aker), an Apache module to display directories as RSS feeds.


outline dispatchmail, rss generation (create, convert (-> rss 1.0), merge)
* url input for http://netfrag.org/rssreader/


xslt processing - top-snapshots (columns: PID, USER, PRI, NI, SIZE, RSS, SHARE, STAT, %CPU, %MEM, TIME, COMMAND):

saxon:
10090 service    8   0  3896 3896   968 R    10.8  2.0   0:00 sablevm
10090 service   12   0  8732 8732   968 R    16.7  4.5   0:00 sablevm
10090 service    8   0 19352  18M   972 R    10.9 10.0   0:01 sablevm
10091 service    8   0 18160  17M   972 R    47.4  9.3   0:01 sablevm
10091 service   10   0 11928  11M   972 R    20.5  6.1   0:02 sablevm
10091 service    8   0 21328  20M   972 R    79.7 11.0   0:04 sablevm
10091 service   12   0 30620  29M   972 R     0.4 15.8   0:08 sablevm
10091 service   18   0 24232  23M   972 R    68.0 12.5   0:10 sablevm
10091 service   18   0 31160  30M   972 R    88.7 16.1   0:17 sablevm
10095 service   14   0 24064  23M   972 R    82.3 12.4   0:04 sablevm
10096 service   10   0 28804  28M   972 R    88.7 14.8   0:11 sablevm

sablotron:


libxslt/xsltproc:
10033 service   14   0 30688  29M  1136 R    37.1 15.8   0:02 xsltproc
10033 service    8   0 48528  47M  1136 R    11.9 25.0   0:03 xsltproc
10033 service   14   0 59852  58M  1136 R     5.2 30.9   0:04 xsltproc
10033 service   16   0 73012  71M  1136 R     3.8 37.7   0:04 xsltproc


times:
#> time ./rss2rdf nfo.yakka nfo.dev.perl nfo.faq.users nfo.links.computing nfo.links.misc

saxon:
real    3m20.816s
user    0m43.577s
sys     0m2.385s

sablotron:
real    23m40.927s
user    0m13.019s
sys     0m7.346s
2x ./rss2rdf: line 1: 10165 Segmentation fault      $sabcmd $2 $1 $3 param-rss-max-items=30
1x ./rss2rdf: line 1: 10246 Killed                  $sabcmd $2 $1 $3 param-rss-max-items=30

libxslt/xsltproc:
real    3m27.296s
user    0m8.769s
sys     0m1.346s

Unfortunately "time --format=..." didn't work....

It appears that large input files make the XSLT processing (regardless of engine) grind, no matter which engine. So let's cut these files down.... (a trimming sketch follows the element definitions below)

(Possible) Solutions:


&lt;Element id="title" type="string" min="1" max="100"/&gt;
&lt;Element id="description" type="string" min="1" max="500"/&gt;
&lt;Element id="url" type="string" min="1" max="500"&gt;
&lt;/Element&gt;
&lt;Element id="link" type="string" min="1" max="500"&gt;
&lt;/Element&gt;
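
In the spirit of "cut these files", a hedged Perl sketch that keeps only the first N items of an RSS 1.0 feed before it goes into the XSLT stage; the 30-item cutoff mirrors the param-rss-max-items=30 already passed to sabcmd above:

#!/usr/bin/perl
use strict;
use warnings;
use XML::LibXML;

my $max_items = 30;   # same cutoff as param-rss-max-items=30

my $doc = XML::LibXML->load_xml(location => shift @ARGV);
my $xpc = XML::LibXML::XPathContext->new($doc);
$xpc->registerNs(rss => 'http://purl.org/rss/1.0/');

my @items = $xpc->findnodes('//rss:item');
for my $i ($max_items .. $#items) {
    # drop everything beyond the cutoff; a complete version would also prune
    # the matching rdf:li entries in the channel's <items> rdf:Seq
    $items[$i]->unbindNode;
}

print $doc->toString(1);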


http://ftp.gwdg.de/pub/linux/suse/ftp.suse.com/suse/i386/9.1/ 134.76.11.100

http://www.interlog.com/~verbum/PNNN____lit/SNNN____chronometry.html http://www.netadmintools.com/art235.html


XInclude + saxon (Xerces):

http://www.gungfu.de/facts/archives/2004/05/23/dokumentmodularisierung-mit-xinclude/ http://www.goshaky.com/docbook-tutorial/ch01s04s05.html#d0e1250 http://www.mnot.net/blog/2003/10/02/modularity_by_reference

test with xmllint!!! #> xmllint --xinclude eingabe.xml > ausgabe.xml
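
libxml2 (what xmllint uses) can expand XIncludes itself, and the same is reachable from Perl; a minimal sketch via XML::LibXML's expand_xinclude parser option:

use strict;
use warnings;
use XML::LibXML;

# resolve the <xi:include> elements while parsing, like `xmllint --xinclude`
my $doc = XML::LibXML->load_xml(
    location        => 'eingabe.xml',
    expand_xinclude => 1,
);
print $doc->toString(1);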



xcvs did not generate patchsets due to insufficient privileges for "service" when accessing /var/xcvs/patches/. Had to delete some rows from the database "xcvs_nfo" to regenerate the lost ones....

DELETE FROM patchset WHERE patchset_id >= 1749;
DELETE FROM revision WHERE patchset_id >= 1749;

and then run xcvs again: #> xcvs -c /etc/xcvs.config

... now works!


"ShowChanges" button for each item at RssReader

-- Main.joko - 07 Sep 2004