netfrag.org . Pad . Joko-2002-11


- refactor Data:: to DTS::
  - mk PerlDTS  (Perl Data Transportation System)
  - DTS::Locator
  - DTS::Transport
  - DTS::Transform
  
- Perl Enterprise -> EnterPerl??? -> EasyPerl???
    -> compare enterprise as adjective (speed, security, flexibility, scalability)
    -> compare as "used in the enterprise"...     "how many enterprises get their job done with perl"
    -> compare with u.s.s. enterprise ;)
    
- re-check-in to appropriate place in cvs:
    - old progressive-libraries
    - old lash-la-rue website
    - new lash la rue website
    - netfrag.org itself?
    - quick2pick-libraries / homepage

- create for twiki
  - some templates for new pages
  - some forms for creating new pages
  - some "actions"
    - new announcement
    - new project
      - new head  -> announcement
      - new release  -> announcement
    - new article
    - new story
    -----------------------    
    - new url
    - new annotation
    - new comment
    
- how to cope with zope (via wddx?) http://zope1.devshed.com/zope.devshed.com/Server_Side/Perl/WDDX/page5.html

- computer supported cooperative work

- published FileSelector for glimmer

- inheritance for MetaL

- TWiki and subversion

- windows-gui for bw_acct

- write NEWS for netfrag.org
  - bw_acct improvement
  - glimmer plugin
  - horde patches
  - w2hfax patches / maintenance
  
- put things on TreeStack
  - ts
    - glimmer plugin "FunctionList"
    (- inheritance for MetaL)
    - mysqldiff improvement
    - EasyORM improvement
    - some perl libraries
    - some php libraries

- add global "notes" (like news://php.notes)

- use PHP_Doc

- Navision Financials

- database schema interop (write about & reference implementation!)
  - xmi
  - orml
  - MetaL
  - PDL
  - db native (sql)

- write about "daily":
    Python 2.2.1 (#34, Apr  9 2002, 19:34:33) [MSC 32 bit (Intel)] on win32
    CVSROOT: joko@quepasa.netfrag.org:/var/lib/cvs (ssh authentication)

- write: "ScanningTakesTooLongTime" or "ScheduledJobsNeeded"
  - full text searches need an index
  - logfile analyzers take time to parse and render
  - system usage overviews (relating to disk space usage and stuff)
  - => build "collectors" for the time until computers are fast/capable enough
  
- write generic "daemonize.pl"-script / Perl-Module (Service::Daemonize)
  - create startup-script for init.d from template
  - can generate bash-, perl- and maybe even php-scripts for system startup
    - handles common start-/stop - actions
    - logs in a common format to a known destination (syslog?)
    - properly reports result on actions
  - optionally creates (process-) monitoring- and accounting-scripts
      (Service::Monitor, Service::Account; ::Quota, ::Stats, ::Quota::CpuUsage, ::Quota::DiskUsage, ::Quota::FsAccess)
  - use examples from h1:
    - bash: /etc/init.d/mysqld, /etc/init.d/netsaintd
    - perl: /etc/init.d/umlhelo, /etc/init.d/quepasa
    - look at common service start-/stop - scripts/concepts from SuSE, Red Hat and Debian (start-stop-service!!!)
  - abstract this: either pid from pidfile is used or remembered on process creation (custom pid buffering)
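
  A minimal sketch of the daemonize core in Perl; the module name is taken from above,
  everything else (option names, behaviour) is an assumption, not existing code:

    package Service::Daemonize;
    use strict;
    use warnings;
    use POSIX qw(setsid);

    # detach from the terminal and record the pid, so the generated
    # start/stop scripts can find the process again later
    sub daemonize {
        my (%args) = @_;
        my $pidfile = $args{pidfile} or die "need a pidfile";
        defined(my $pid = fork()) or die "fork failed: $!";
        exit 0 if $pid;                         # parent returns to the init script
        setsid() or die "setsid failed: $!";
        chdir '/';
        umask 0;
        open STDIN,  '<',  '/dev/null';
        open STDOUT, '>>', $args{logfile} || '/dev/null';
        open STDERR, '>&', \*STDOUT;
        open my $fh, '>', $pidfile or die "cannot write $pidfile: $!";
        print $fh "$$\n";
        close $fh;
        return $$;                              # the new daemon pid
    }

    1;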
  
- proposal for nfo's "Boot" namespace (main level) in perl (sketched below):
  - write an abstract Boot::Environment which handles
    - booting of database connections
    - starting of sub-processes (Local Terminals, (Remote-/Secure-) Shells, Apaches, POEs)
    - context detection and (include)-library-path initialization
    - some other preparations
  - refactor use_libs.pl to Boot::Environment::Context, maybe split into Boot::Environment::Context::Apache, etc.
  - refactor BizWorks::Boot to Boot::Environment::BizWorks (and think about another name for BizWorks at this point)
  - refactor (maybe) daemonize.pl into Boot::System::Service::Daemonize
  - think about other modules from cvs and/or development directories which might fall into this namespace, e.g.
    - Boot::System::Config::Patch (build as API for UML::ConfigPatcher:: namespace and remove parts from there)
      - refactor parts of UML::ConfigPatcher:: namespace into NFO::Core::File
    - Boot::System::Config::Factory (which is capable of building config-files from scratch)
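
  A rough sketch of what the abstract Boot::Environment could look like; the method
  names and the context heuristics are assumptions, not decided yet:

    package Boot::Environment;
    use strict;
    use warnings;
    use FindBin;
    use lib "$FindBin::Bin/../lib";    # (include-) library-path initialization

    sub boot {
        my ($class, %args) = @_;
        my $self = bless { context => detect_context() }, $class;
        $self->{databases} = [ map { $self->connect_database($_) } @{ $args{databases} || [] } ];
        $self->{processes} = [ map { $self->start_subprocess($_) } @{ $args{subprocesses} || [] } ];
        return $self;
    }

    # context detection: running under Apache/mod_perl, on a terminal, or as a batch job?
    sub detect_context {
        return 'apache'   if $ENV{MOD_PERL} || $ENV{GATEWAY_INTERFACE};
        return 'terminal' if -t STDIN;
        return 'batch';
    }

    # stubs - concrete subclasses (e.g. Boot::Environment::BizWorks) would fill these in
    sub connect_database  { die "not implemented" }
    sub start_subprocess  { die "not implemented" }

    1;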

- analysis of the startup phase of a linux system with heavy process use
  - use top, vmstat...., something about the filesystem, etc.
  - maybe investigate some optimizations
  - look at aspects of overall performance, time to (re-)availability and failures/races in startup phase
    - "races" you do normally have in applications, but on systems there many as well, e.g. firewall-startup-phase,....

- The YAHAA - Framework (Yet Another Highly Abstracted Application)
  - refactor BizWorks:: code to a YAHAA::Application (YAHAA::Raindrop)
  - refactor multiple daemons into YAHAA::Backplane (YAHAA::Cloud)
  
- apiapisti??? no! it was another word... hmmm....

- bad point with TWiki
  - no common url -> file mapping/relation from client view (technical aspect for e.g. robots)???

- handlers for Data::Storage (see the sketch below this list)
  - Storable
  - Data::Dumper
  - AnyData
  - interface to OLE::  ( see http://www.cpan.org/modules/by-category/07_Database_Interfaces/OLE/ )
  - interface to various IPC:: stuff
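
  A minimal sketch of a Storable-backed handler; the interface (new/load/save) is an
  assumption here, not the real Data::Storage handler API:

    package Data::Storage::Handler::Storable;
    use strict;
    use warnings;
    use Storable qw(nstore retrieve);

    sub new  { my ($class, %args) = @_; bless { file => $args{file} }, $class }

    # load the whole object tree from disk, or undef if nothing has been stored yet
    sub load { my $self = shift; -e $self->{file} ? retrieve($self->{file}) : undef }

    # persist the object tree in network byte order, so the file travels between hosts
    sub save { my ($self, $data) = @_; nstore($data, $self->{file}) }

    1;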
  
  
- Data::Storage is ...
  - ... just another wrapper api
  - ... one which really helps because it adds some neat stuff

- write Mail::Addressbook::Convert::Pst (Outlook)

- write perl source filter to add functionality to builtins to be able to handle referenced values
  - e.g.: foreach for references!
  
- wiring poe objects with tangram

- use umltools with IPC::Run

- rewrite of observer with POE::Component::IRC:: ( see http://www.funkplanet.com/POE/ )
   -> better to use ptirc or Perl-fu right away
   -> add kick-on-idle-feature

- investigate POE::Component::RRDTool

- look at POE::Component::SubWrapper, POE::Component::UserBase and POE::Session::Cascading

- test POE::Component::LaDBI with Tangram (Data::Storage)

- transport data via RPC::XML, SOAP - okay - but there are others:
  - MOP::MOP 

- port File::Remote to win32 somehow

- use Shell.pm (+Shell::SSH) with umltools

- new area on netfrag.org: bits of perl code

- patch to Perl's Shell.pm

- start project "autogather" (recieves/waits for links/pointers, gets them, adds them to index and send notification)

- write source-filter for perl which does a print "$sub_name" after each "sub <sub_name> {"
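
  A possible starting point via Filter::Simple; the package name is made up and the
  regex is naive (it will also hit strings that happen to contain 'sub foo {'):

    package Devel::AnnounceSubs;
    use strict;
    use warnings;
    use Filter::Simple;

    # rewrite every 'sub name {' so the sub prints its own name on entry
    # (STDERR, to keep it out of the program's normal output)
    FILTER_ONLY code => sub {
        s/\bsub\s+(\w+)\s*\{/sub $1 { print STDERR "$1\\n";/g;
    };

    1;

  Used with a plain 'use Devel::AnnounceSubs;' at the top of the script to be traced.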

- ways to run programs from perl: do, system, ``/qx(), Shell, IPC::Run
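
  The same command run the different ways, for comparison (IPC::Run avoids the shell
  and hands back the captured output):

    use strict;
    use warnings;
    use Shell qw(ls);                       # autoloads an ls() wrapper
    use IPC::Run qw(run);

    system('ls', '-l');                     # inherit stdout, exit status ends up in $?
    my $out1 = `ls -l`;                     # backticks / qx(): capture stdout via a shell
    my $out2 = ls('-l');                    # Shell.pm wrapper around the same thing
    my $out3;
    run [ 'ls', '-l' ], '>', \$out3;        # IPC::Run: no shell, stdout redirected into $out3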

- look at IPTables::IPv4

- UnpackHere (unpacks tars, zips, ...)

- look at DBIx::FullTextSearch

- 20:59 <[joko]> -> todolist: free php encoder

- start abstract/meta project (like poop) for establishing an intranet for a small/medium sized company 
  completely on top of opensource software - both on windows and linux (workstations) and a linux-server
  
- hibernate <-> perl???

- openchallenge

- write replacement for WinSCP

- TWiki -> chm compiler

- SendTo for Mahogany

- LDAP for Mahogany

- news.netfrag.org

- category.netfrag.org: publishing/content categories for www.netfrag.org
  - implemented via TWiki or arbitrary category database
  
- archive.netfrag.org / autoarchive.netfrag.org (see the sketch below)
  1. checks all incoming data for urls
  2. connects to them and maybe to some links contained at the target (inside the page)
  3. writes them to the database (metadata and content)
  4. a web interface has to be implemented
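
  A minimal sketch of steps 1-3; store_page() is hypothetical and stands in for the
  real database layer:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use LWP::Simple qw(get);

    my $incoming = do { local $/; <STDIN> };              # whatever arrives (mail, irc, ...)
    my @urls = $incoming =~ m{(https?://[^\s">]+)}g;      # 1. check incoming data for urls

    for my $url (@urls) {
        my $content = get($url);                          # 2. connect to the target
        next unless defined $content;
        store_page($url, $content);                       # 3. write metadata and content
    }

    sub store_page { }                                    # hypothetical - to be backed by the db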
  
- put example pdl-file to vops

- create CASE tool out of:
  - tangram
  - dia: http://dia.sourceforge.net/
  - (maybe) dia2code: http://dia2code.sourceforge.net/
  - dia2schema (tangram): to be coded
  - see http://dia2code.sourceforge.net/diagrams/example1.dia
  
- http://www.thefreeworld.net/
   
- find most often visited links

- fix bugs:
  - bugs.gnome.org
  - bugs.debian.org
  - bugs.php.net

- use Rijndael!

- from http://www.dwheeler.com/essays/fosdem2002.html
    ".NET" is a marketing term in Microsoft. Under this umbrella there are lots of technologies, including Passport, etc.; that's not relevant to this talk. What's relevant is the ".NET Framework," which is what Mono will implement. This framework contains the following: 
    - A Virtual execution system (VES), like JVM but multi-lingual (C#, Visual Basic, Javascript...). Code is expressed in the Common Intermediate Language (CIL). 
    - A library 
    - A Language, C#. 
    
- look at corba, e.g. at
    - http://orbit-resource.sourceforge.net/
    
- chromagnon: http://java-gnome.sourceforge.net/screenshots/crontab-edit.jpg

- cgm for java: http://www.bdaum.de/howto.htm

- alternatively install cvsview on cvs.netfrag.org

- schema interop: XMI <-> tangramschema <-> dia

- netfrag.org - searchbar for internet explorer

- schemas / tangram:
  detect schema changes on all relational stuff (DBI)
  handle schema changes on the oo-style abstraction first (Tangram) because it's important for round-trip engineering

  common migration (see the sketch below):
    1. alter table (schema-deploy+)
    2. fill with default values where needed
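
  The two migration steps as plain DBI calls, assuming an already connected handle
  $dbh; table and column names are made-up examples:

    # 1. alter table (what schema-deploy+ would have to emit)
    $dbh->do('ALTER TABLE customer ADD COLUMN status VARCHAR(16)');

    # 2. fill the new column with a default value where needed
    $dbh->do(q{UPDATE customer SET status = 'active' WHERE status IS NULL});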

- publish
  categorize all detail tools as stuff ;) in order to make the big picture more clear (for bmbf?)
    - umltools
    - hrcontrol
    - oo-rdbms-mapper
    - glimmer functionlist
    - horde patches
    - bwacct patches
    - w2hfax patches

  refactoring
    - aim: 
       - encapsulate all generic libraries to a library called "flib" and publish this stuff
       - restructuring of boot.inc and e.php
    - resolved some remaining strict/direct dependencies on global variables

- look at Desktop/tangram-patches.txt

- write a Tangram::PhpDump similar to the Tangram::PerlDump
  should store perl variables serialized in php style to table fields using PHP::Serializer::XYZ
  
- perl: use FindBin;

- mail-handler (see the routing sketch below)
  - derive from "recieveMail" and "gateMail"
  - what is it?
    - source(s) -> mail-handler -> target(s)
  - aims:
    - 1. multi - routing
      - possible sources are: smtp, pop3
      - possible targets are: /var/spool/mail/<username>, ~/Mail/mbox, fax, sms
    - 2. tracing & triggering
      - scan for certain messages and run specified scripts
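
  A minimal routing sketch for aims 1 and 2; fetch_pop3() and the deliver_*() targets
  are hypothetical stand-ins for the real sources/targets:

    my @routes = (
        # 2. tracing & triggering: certain messages run a specified script
        { match => qr/^Subject:.*\bFAX\b/im, target => \&deliver_fax  },
        # 1. multi-routing: everything else goes to the local mbox
        { match => qr/./,                    target => \&deliver_mbox },
    );

    for my $msg (fetch_pop3()) {             # possible sources: smtp, pop3
        for my $route (@routes) {
            next unless $msg =~ $route->{match};
            $route->{target}->($msg);
            last;                            # first matching rule wins
        }
    }

    sub fetch_pop3   { return () }           # hypothetical stubs
    sub deliver_fax  { }
    sub deliver_mbox { }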
    
- enhance flib/debugbox with some real debugging:
  - make a stamp (file:line) at each "dprint" and store these events (metadata) to a debug database including context dump
  - add http link to each debug entry pointing to a "debug.php" showing off debugging metadata and sourcecode
  
- refactor "search in listbox" script from quick2pick code (javascript) to reusable component
  - for citymapXYZ?
  
- babbled with fcode about a _very_ interesting library he wrote....  publish this thread on news.netfrag.org?

- netfrag.org:
  - publish "url shortcuts for accessing netfrag.org's cvs via web" at news.netfrag.org
  - implement core registry (getUsers, getProjects)
  - set gui on top ( on each user: approve content )
  - integrate all the scripts flying around and both types of wiki (phpWiki, TWiki)
  - write article: netfrag.org: what you can do beside the "normal paths", which tools have we prepared for you?
  - put complete www to cvs (is nfo self-hosting then?)
      ---->  no! phpWiki isn't migrated to TWiki yet, so data "is still cluttered" in mysql directly   ;(
  - add some scheduled services (generators, etc.)
  
- microwiki - two files: dmp.php (DiffMergePatch) & microwiki.php

- collect.netfrag.org
  - example application for collecting and merging irc-log-files
  - the cycle is: upload -> compute -> approve/merge
  - the features you get are: advanced browsing (e.g. by type of line, by channel, by nickname, by XYZ)
  - use
    - transactions/router
    - some html-viewing/editing components: view=browse; edit=editSearchPatternDb
    
+ joko: fix mail delivery rules

+ netfrag.org: cvs.netfrag.org url shortcuts

- netfrag.org: system.getMembers (system registry)

- write to jeff zucker, author of SQL::Parser and DBD::CSV when changes are running stable
  (maybe) add File::Analyze to generate rules automatically

- enhance XML::CSV

- look at XML::DBMS

- look at Perl module DBMS

- perl:
  - shells:
      SOAPsh
      psh
      DBI::Shell (dbish?)
      XMLRPCsh
      ysh
  - utils:
      perlcc
      perl2bat
    
      mysqldiff!

- inject_package ;)   (into UML)

- Data::Storage
  - what about calling unknown methods on the main handle???
  - getChildNodes should be able to deliver all tables - or all databases (per context/use)

- mysql-backup-server (all-databases)

- attempt a generic Log::Dispatch::DBI

- put code ConfigPatcher/Handlers.pm to libp.pm somehow
  - think about libp.pm...!?
  
+ pstree -unpal

- mnogosearch on grasshopper

- mkbd.pl
   - win32/linux (for/on any)

- injecttron.pl (querytron.pl)
   - rcp
   - addRegistry
   - run

- sfe - scan for exploit

- bi-directional synchronization 
  - data-row-based (rdbms)
  - data-oo-based (oodbms)
  - file (directory/file-tree)
  - file (diff/patch)

- Don't grep wildly! grep to _determine_
  compare: the patch to bw_acct vs. a csv line scanner (the regex-based one that doesn't exist yet)
  
- Perl: in & out (see the example below)
  - buffering
  - in: read from diamond, stdin, iterating through lines, parsing, select fh
  - out: stdout, stderr, redirecting, tying, logging, debugging
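
  A tiny example touching most of these points (diamond input, line iteration,
  stdout/stderr, buffering):

    use strict;
    use warnings;

    $| = 1;                               # autoflush the selected handle (STDOUT)
    while (my $line = <>) {               # diamond: files from @ARGV, or STDIN
        chomp $line;
        print STDOUT "got: $line\n";      # normal output
        print STDERR "suspicious line\n" if $line =~ /error|warning/i;
    }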

- constraints!? for DBD::CSV (see the parsing sketch below)
  - store them either in a separate file (per directory/db) or somehow encoded inside the header columns
  - e.g. "columnname[PRIMARY KEY, UNIQUE]"
  
- webar (web archiving system)

- services:
    ids (aida, portsentry)
    compartments (freevsd, uml)
    no inetd
    all ports on ssl/tls

- error detection: keywords to scan for: error, warning, unknown, @INC, failed, Can't locate 
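
  As a throwaway scanner this is just (keyword list taken from above):

    use strict;
    use warnings;

    my @keywords = ('error', 'warning', 'unknown', '@INC', 'failed', "Can't locate");
    while (my $line = <>) {
        print $line if grep { index(lc $line, lc $_) >= 0 } @keywords;
    }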

- interesting
  Text::CSV_XS
  DBD::File
  DBD::CSV
  SQL::Statement

- William Gibson - Neuromancer ("The Gibsonian vision of cyberspace")

- Neal Stephenson - Snow Crash 

- web based xml editor for http://livingxml.net/

- bochs demo linux (take better one than dlx linux)
   why not use spline?

- write about binary file compatibility
   - compare executables (linux, win, freebsd, macosx, solaris)
   - compare data files
   - solution: byte-code languages and xml-files

- get into Perl XS and/or Perl Inline::C

- unsubscribe from perl mailing lists available via news

- upload progress of Sync.pm

- upload some screenshots of kde-on-cygwin

- tangram and database indices ...

- make lab(s).netfrag.org

- look at Perl's XML-DBMS

- mirror contents from google cache and/or archive.org to have a "safety net" in case of "information loss" ;)

- what about syncing a fixed data structure (e.g. a hash) with a dbms? (see the sketch below)
  the hash would then have to provide various metadata to be processed...
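
  A very small sketch of the write direction, assuming a hypothetical key/value table
  settings(name, value) in mysql and nothing about the metadata part yet:

    use strict;
    use warnings;
    use DBI;

    my %config = ( hostname => 'quepasa', retries => 3 );

    my $dbh = DBI->connect('dbi:mysql:database=nfo', 'user', 'password', { RaiseError => 1 });

    # REPLACE keeps the table rows in sync with the current hash contents
    my $sth = $dbh->prepare('REPLACE INTO settings (name, value) VALUES (?, ?)');
    $sth->execute($_, $config{$_}) for keys %config;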

- for processing large and/or schema-less data chunks (see the sketch below):
  - base it on tangram (why not?)
  - dump it via Data::Dumper (configurable: "condensed")
  - add layer: "packed" (via some gzip handler)

- parameters for a synchronization process:
  - force???
  - erase_destination
  - ...

- collections? what are (e.g.) collections?
  - topics
  - events on timeline
  - places
  - media library items
  - media formats
  - visit virtual directories and link collections:
     - dmoz.org
     - http://lcweb2.loc.gov/ammem/collections/finder.html

- style TWiki like http://www.amk.ca/

- Data::Storage & Berkeley's sleepycat???

- pos - Perl Object Server

- style apache autoindex like http://www.atamai.com/downloads/Atamai-1.0/
