Multi Code

////////////////////////////////////////////////////////////////////////////

🐞Multi Code マルチコード

maXbox Starter 155 – Code in Multicode.

Source: Split a character string based on change of character – Rosetta Code

Script_: Download 1433_charsplit_pas_py_fp_js_uc_test.txt (maXbox5)

Multi-code programming refers to the practice of integrating multiple programming languages, paradigms, or models within a single system or application. This approach is particularly useful in scenarios where different languages or tools excel at specific tasks, allowing developers to leverage their strengths for optimal performance and flexibility.

Modern CLIs also do more than run commands. They:

  • Save time with automation
  • Integrate with cloud and AI tools
  • Keep your focus in one window
  • Work the same across Linux, macOS, and Windows

So CLI tools still matter, and I want to show that with one problem from Rosetta Code and the corresponding 5 solutions:

The advantages of multi-code programming are:

  • Optimized Performance: Each language or tool is used where it performs best.
  • Flexibility: Adapts to diverse project requirements.
  • Scalability: Supports complex, large-scale systems.

We combine languages like Python, Pascal, Delphi, Free Pascal and JavaScript with the following problem:
“Split a (character) string into comma (plus a blank) delimited strings based on a change of character (left to right)”.
Blanks should be treated as any other character (except they are problematic to display clearly). The same applies to commas. For instance, the string:
gHHH5YY++///\
should be split and show: g, HHH, 5, YY, ++, ///, \

We start with the Pascal solution:

function SplitAtChars(const S: String): String;
var i: integer; lastChar: Char;
begin
  result:= '';
  if length(s) > 0 then begin
    lastChar:= s[1];
    result:= lastChar;
    for i:= 2 to length(s) do begin
      // on a change of character remember the new char and emit the ", " separator
      if s[i] <> lastChar then begin
        lastChar:= s[i];
        result:= result+ ', ';
      end;
      result:= result+ lastChar;
    end;
  end;
end; 

The second one, the Delphi solution, has a form to configure the output:

const S1 = 'gHHH5YY++///\';
procedure ShowSplitString(Memo: TMemo);
var S2: string;
begin
   Memo.Lines.Add(S1);
   S2:= SplitStringCharChange(S1);
   Memo.Lines.Add(S2);
end;

It works with Delphi version 6.0 and needs the libraries SysUtils and StdCtrls; in maXbox these are precompiled on board.

The third one is Free Pascal and runs on the internet with the help of TIO (Try It Online). The web server of TIO and the arenas (where user code is executed) currently run on three separate servers. TIO is getting more and more traffic, so additional arenas will be required. Also, server-side permalinks will eventually require separate storage. With your help, I hope to ensure a smooth operation of all TIO services.

TIO is powered by Digital Ocean.

The next is JavaScript and will also run in a browser, but a locally embedded one. With WebView2 in Microsoft Edge, developers can now embed web content directly within native Windows applications. WebView2 offers a simple way to display web-based content using the Chromium engine inside your apps or scripts.
Specifically, WebView2 can run HTML, CSS, and JavaScript inside a standard non-web program, like something written in Visual C++, Delphi or .NET apps.

Support for Edge WebView2 on Windows 7, 8, Server 2012 and 2012 R2 ended with Edge WebView2 version 107 in October 2022, because those operating systems are no longer supported.

procedure JS_Solution;
begin
   with TEdgeViewForm.create(self) do begin
      width:= 1500;
      pagecontrol1.height:= 540;
      icon.loadfromresourcename(hinstance,'XJICON');
      Caption:= 'maXbox5 EdgeView2BrowserFeed_JScriptSolution' ;
      panel1.ParentBackground:= false;
      panel1.color:= clnavy;
      sleep(500);
      navigate('https://maxbox4.wordpress.com/2025/07/01/ekon-29/'); 
      memoJavaScript.text:= JSCRIPT;//JS_InitFunc;
        PageControl1.ActivePageIndex := 1;
        memoHTML.font.size:= 26;
        memoJavaScript.wordwrap:= true;
        memoJavaScript.lines.linebreak:= lf;
      tabsheet3.caption:= 'Result';
      panel3.caption:= 'Source: Result';
      sleep(300);
      ExecuteScript(JSCRIPT); 
      sleep(1600);
      writeln('result of js webview2: '+memoHTML.text);
      writeln('scriptresult of js webview2: '+getScriptresult);
      showmodal;    //save resources
      free;
   end;
end;
pic: with TEdgeViewForm.create(self) do begin
const JSFunc ='function stripchars(string, chars) { '+lf+
              '  return string.replace(RegExp("["+chars+"]","g"), ""); }'+lf+
              'stripchars("She was a soul stripper. She took my heart!","aei")';
              
procedure JS_Solution;
begin
   with TEdgeViewForm.create(self) do begin
      sleep(500);
      ExecuteScript(JSFunc);
      sleep(400);
      //)ExecuteScript('23*4');
      //memoJavaScript.text:= '23*4';
      //ExecuteScript('stripchars("She was a soul stripper. She took my heart!","aei"'); 
      sleep(2200);
      //writeln('result of js webview2: '+memoHTML.text);
      writeln('scriptresult jswebview2: '+getScriptresult);
      //showmodal;    //save resources
      free;
   end;
end;              

The different sleep() calls are a bit tricky because you need them to control the asynchronous return, otherwise you can't catch the result at the time you expect it. It takes some time to execute the script, and a proper wait state is missing. The destination folder can contain the script, the script can carry it as a const, or you can invoke a script as a URL from a server with:

Navigate('https://maxbox4.wordpress.com/2025/07/01/ekon-29/jsdemo.js');

const JSCRIPT =
'(() => {                                                      '+lf+ 
'    "use strict";                                             '+lf+ 
'                                                              '+lf+ 
'    // ----------- SPLIT ON CHARACTER CHANGES ------------    '+lf+ 
'    const main = () =>                                        '+lf+ 
'        group("gHHH5YY++///\\")                               '+lf+ 
'        .map(x => x.join(""))                                 '+lf+ 
'        .join(", ");                                          '+lf+ 

The above is the basic usage of the WebView2 handling. You can customize the parameters and error handling based on specific needs. There’s a tutorial about the topic: Tutorial 129 Using WebView2, October 2024.
The last solution is the Python one, wrapped in a procedure:

Const PYFUNC =
  'def splitter(text):  '+lf+
  '  return ", ".join("".join(group) for key, group in groupby(text)) ';

procedure PY_Charsplit_Solution;
begin
 with TPythonEngine.Create(Nil) do begin
    autofinalize:= false;
    loadDLL;
    try
      ExecString('import sys; from itertools import groupby');
      ExecString(PYFUNC);
      addclient(TEngineClient.create(nil));
      execstr('txt = ''gHHH5YY++///\\''  # Note backslash is Python escape char.');
      println(evalstr('f''Input: {txt}\n Split: {splitter(txt)}'''));  
      println(evalstr('sys.version'));
    except
      raiseError;
    finally
      unloadDll;
      Free;
    end;  
  end; 
end; 
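Outside the P4D embedding, the same splitter can be cross-checked in plain Python (a minimal, standalone sketch of what the PYFUNC string does):

# standalone check of the embedded splitter
from itertools import groupby

def splitter(text):
    # join consecutive runs of the same character, separated by ", "
    return ", ".join("".join(group) for _, group in groupby(text))

print(splitter("gHHH5YY++///\\"))   # -> g, HHH, 5, YY, ++, ///, \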

Conclusion

Multi-code programming is a powerful approach for modern software development, especially in domains requiring high efficiency and adaptability.

Put simply, Microsoft Edge WebView2 is a stripped-down version of the Microsoft Edge browser (based on Chromium) that developers can use to run web code inside the applications you are using. By leveraging the web platform inside desktop scripts, developers can create rich user experiences, improve performance, and utilize modern web capabilities.

The challenges are:

  • Complexity: Managing multiple languages can increase development and debugging effort.
  • Interoperability Issues: Ensuring smooth communication between components.
  • Learning Curve: Requires expertise in multiple programming paradigms.

Multi-code interoperability facilitates communication between different languages or frameworks, often through APIs, shared memory, or intermediate representations.

References:

How to Use WebView2 in Microsoft Edge Browser – GeeksforGeeks

Try It Online

maxkleiner/P4D-Data-Sciences: A collection of lightweight Python wrappers based on Python4Delphi simplifying Data Sciences development with Delphi

How to use SHFileOperation – Blog – Silicon Cloud

Doc and Tool: maXbox5 – Manage Files at SourceForge.net

Release maXbox V5.2.9 · maxkleiner/maXbox5

Max Kleiner 05/09/2025


Release Notes maXbox 5.2.9.198 September 2025 mX529 beta64-bit🐞


SynEdit upgrade, DisplayFlowControl, Structural Highlighting
GeoMap View3 upgrade, Tutorial3 Modular upgrade folding
SynEdit highlighter updates: JScript, Python, Pascal, C++
WebView2 loader upgrade getScriptresult() of JavaScript
Print & Export Redesign – menu/File/Print Studio5
maxform1.SystemInfo1Click – menu/Debug/System Info
Modules Count_: 3860 Units maXbox5_29beta190.xml

03.09.2025 17:25
Release Notes maXbox 5.2.9.198 September 2025
SHA1: 5.2.9.198 maXbox5.exe 5a51dfe26c9f57e6c51f47bc9e72625a79646b51
SHA1: ZIP maxbox52.zip AAA6C16CC1311A2C0DA28B212A8FBEB12FD9D1EB

maXbox to VirtualBox
WebView2 Runtime Tasks

MP3 File Duration over Archives

function MP3PlayLength(aMP3FileName:string):string;
var
  wMP: TMediaPlayer;
  wLen: Cardinal;
 begin
   Try
     wMP:= TMediaPlayer.Create(self);
     try
        wMP.Visible:= false;
        wMP.parent:= self;
        wMP.FileName:= aMP3FileName;
        wMP.TimeFormat:= tfMilliseconds;
        wMP.DeviceType:= dtAutoSelect;
        wMP.Open;
        //sleep(50)
        try
           wLen:= trunc(wMP.Length / 1000);
           result:= inttostr(wLen div 60)+':'+inttostr(wLen mod 60);
        finally
           wMP.Close;
        end;
     finally
        wMP.free;
     end;
   except
     result:= '(mpplay err)';
   end;
 end;

We iterate with findfiles() over recursive directories and calculate the length:

    try
      seconds:= TimeOfDateTimeToSeconds(strtotime(MP3PlayLength(srlist.strings[it]))/60)
    except
      writeln(ExceptionToString(ExceptionType, ExceptionParam));
      writ(srlist.strings[it])
    end;
    totaltime:= incsecond(totaltime, seconds);

Python Solution:
//python template III  
procedure pymp3_Solution;
 begin
   with TPythonEngine.Create(Nil) do begin
     autofinalize:= false;
     loadDLL;
     try
      ExecString('import datetime,os,sys');
      ExecString('from mutagen.mp3 import MP3');
      ExecString(PYFUNC);
      execstr('path = str(r"'+songpath+'")'   +lf+
              'files = os.listdir(path)'   +lf+
               'total_files_time = 0 '+lf+
               'for f in files:                  '+lf+
               '   if f.endswith(''.mp3''):      '+lf+
               '      filePath = path + ''/'' + f'+lf+
               '      audio = MP3(filePath)      '+lf+
               '      length = audio.info.length '+lf+
               '      total_files_time += length ');
               
      println('pyout: '+evalstr('get_formatted_time(total_files_time)+ " of music!"'));
      println('pyout: '+evalstr('sys.version'));
      addclient(TEngineClient.create(nil));
      writeln('PythonOK '+botostr(PythonOK)+ ' clcount:'+itoa(clientcount));
     except
       raiseError;
     finally
       unloadDll;
       Free;
     end;  
   end;  
 end;  

https://sourceforge.net/projects/maxbox5/files/examples/1437_findallfiles2mp3_64_py.txt/download

Go for Life

Jaro similarity

The Jaro distance is a measure of edit distance between two strings; its inverse, called the Jaro similarity, is a measure of two strings' similarity: the higher the value, the more similar the strings are. The score is normalized such that 0 equates to no similarities and 1 is an exact match.

https://rosettacode.org/wiki/Jaro_similarity#Pascal

 //converted from C source by /u/bleuge
function ssJaroWinkler(s1, s2: string): double;
var
  l1, l2, match_distance, matches, i, it, k, trans: integer;
  bs1, bs2: array[1..255] of boolean; //used to avoid getmem, max string length is 255
 begin
  l1:= length(s1);
  l2:= length(s2);
  //fillchar(bs1, sizeof(bs1), 0); //set booleans to false
  for it:= 1 to sizeof(bs1) do bs1[it]:= false;
  //fillchar(bs2, sizeof(bs2), 0);
  for it:= 1 to sizeof(bs2) do bs2[it]:= false;
  if l1 = 0 then begin
    if l2 = 0 then
      result:= 1   //two empty strings are an exact match
    else
      result:= 0;  //only one empty string: no similarity
    exit;
  end;
  match_distance:= (max(l1, l2) div 2) - 1;
  matches:= 0;
  trans:= 0;
    for i:= 1 to l1 do begin
       for k:= max(1, i - match_distance) to min(i + match_distance, l2) do 
       begin
         if bs2[k] then
           continue;
         if s1[i] <> s2[k] then
           continue;
         bs1[i]:= true;
         bs2[k]:= true;
         inc(matches);
         break;
       end;
    end;
  if matches = 0 then begin
    result:= 0;
    exit end;
  k:= 1;
  for i:= 1 to l1 do begin
    if (bs1[i] = false) then
      continue;
    while (bs2[k] = false) do
      inc(k);
    if s1[i] <> s2[k] then
      inc(trans);
    inc(k);
  end;
  trans:= trans div 2;
  result:= ((matches / l1)+(matches / l2)+((matches - trans)/matches)) / 3;
 end;
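For comparison, here is a plain-Python sketch of the Jaro similarity (the common zero-based formulation, not a line-by-line port of the Pascal code above):

def jaro(s1, s2):
    # empty-string handling: both empty -> exact match, one empty -> no similarity
    if not s1 and not s2:
        return 1.0
    if not s1 or not s2:
        return 0.0
    match_dist = max(len(s1), len(s2)) // 2 - 1
    s1_matches = [False] * len(s1)
    s2_matches = [False] * len(s2)
    matches = 0
    # count characters that match within the allowed window
    for i, c in enumerate(s1):
        lo, hi = max(0, i - match_dist), min(i + match_dist + 1, len(s2))
        for k in range(lo, hi):
            if not s2_matches[k] and s2[k] == c:
                s1_matches[i] = s2_matches[k] = True
                matches += 1
                break
    if matches == 0:
        return 0.0
    # count transpositions among the matched characters
    t, k = 0, 0
    for i, c in enumerate(s1):
        if s1_matches[i]:
            while not s2_matches[k]:
                k += 1
            if c != s2[k]:
                t += 1
            k += 1
    t //= 2
    return (matches/len(s1) + matches/len(s2) + (matches - t)/matches) / 3

print(round(jaro("MARTHA", "MARHTA"), 4))   # 0.9444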
https://sourceforge.net/projects/maxbox5/files/examples/1440_Jaro_distance.txt/download
https://sourceforge.net/projects/maxbox5/files/examples/1440_Jaro_distance1.txt/download
maXbox Tracker Test IX
Füssen

This book offers railway enthusiasts more than 500 detailed consists of the Trans Europ Express and related trains, from 1929 to 2004, covering three quarters of a century of European railway history, from expresses to TGV and ICE. The book is in French and English.

https://laboutiqueducabri.fr/produit/les-compositions-des-tee-et-des-trains-homonymes-tee-and-homonymous-trains-formations/

This book collects not only the formations of the 66 TEE trains that ran from 1957 to 1995, but also trains with the same name, whether before 1957 or after 1995. So besides the express and fast trains one could find the German Fernschnellzüge, the Intercity, then the EuroCity and finally TGV and ICE. Model railroaders and other railway enthusiasts thus have more than 500 different formations of these famous trains at their disposal, ranging from Era II (1929 for the oldest) to Era V (2004 for the most recent), an overview of three quarters of a century of railway history.

Impressum – p.3
Préface – p.4
Foreword – p.5

1 – LES TEE INTERNATIONAUX / INTERNATIONAL TEE TRAINS – p.6-7
1.1 France – Belgique : TEE Memling, Watteau (I), Oiseau Bleu, TEE 112, 123, 130, Brabant, Rubens – p.8-15
1.2 France – Belgique – Pays-Bas : TEE Étoile du Nord, Ile de France (I) – p.16-21
1.3 France – Belgique – Allemagne : TEE Paris Ruhr, Molière, Parsifal – p.22-27
1.4 Allemagne – Belgique : TEE Saphir – p.30-31, TEE Diamant (I) – p.32-33, TEE Albert Schweitzer – p.34-35, TEE Goethe (I) – p.36-37
1.5 France – Allemagne – p.34
1.6 Pays-Bas – Belgique – Luxembourg – France – Suisse : TEE Edelweiss – p.38-39
1.7 Belgique – Luxembourg – France – Suisse : TEE Iris – p.40-41
1.8 France – Suisse : TEE Arbalète – p.44-45
1.9 Allemagne – Suisse : TEE Helvetia – p.48-49
1.10 Allemagne – Suisse – Italie : TEE Roland (I), Mont Cenis, Ligure – p.52-59
1.11 France – Italie : TEE Cisalpin – p.60-61
1.12 France – Suisse – Italie – p.60
1.13 Suisse – France – Espagne : TEE Catalan Talgo – p.64-65
1.14 Suisse – Italie : TEE Gottardo, Ticino, Lemano – p.66-71
1.15 Allemagne – Autriche – Suisse : TEE Bavaria – p.74-75
1.16 Allemagne – Autriche – Italie : TEE Mediolanum – p.76-77
1.17 Allemagne – Autriche : TEE Blauer Enzian (II), Prinz Eugen – p.78-83
1.18 Allemagne – Danemark : TEE Merkur – p.86-87
1.19 Allemagne – Pays-Bas : TEE Rhein Main, Van Beethoven, Rembrandt, Erasmus – p.88-95
1.20 Pays-Bas – Allemagne – Suisse : TEE Rheingold – p.98-99
2 – LES TEE NATIONAUX / NATIONAL TEE TRAINS – p.100-101
2.1 France : TEE Le Mistral, Lyonnais, Rhodanien, Capitole (matin et soir), Aquitaine, Étendard, Kléber, Stanislas, Faidherbe, Gayant, Watteau (II), Jules Verne – p.102-125
2.2 Allemagne : TEE Bacchus, Goethe (II), Roland (II), Diamant (II), Friedrich Schiller, Lufthansa Airport Express, Heinrich Heine – p.126-141
2.3 Italie : TEE Adriatico, Settebello, Colosseum, Ambrosiano, Vesuvio, Cycnus, Aurora – p.142-153

Quelques TEE Rail Alliance – p.154-155
Légendes / Captions – p.156
Compagnies ferroviaires / Railway companies – p.157
Remerciements / Thanks / Dessinateurs – p.158
Bibliographie / Weblinks / Auteur – p.159

Annexe 1 : TEE internationaux – p.160-166
Annexe 2 : TEE nationaux – p.167-169

https://www.menzels-lokschuppen.de/Literatur/Eisenbahn-Buecher/LR-PRESSE-COMPOTEE-Les-compositions-des-TEE.html

Arduino 2.3.6 & maXbox 5.2.9
writeln('EWI_Exc:'+ExceptiontoString(exceptiontype,exceptionparam));
procedure TForm8FormShow(Sender: TObject);
 begin
  // setup cport (already set to 9,600 baud rate)
  try
   if IsCOMPortReal(3) then begin
    ApdComPort1.ComNumber:= 3;
    writeln('debug comport: '+objtostr(apdcomport1)) ;
     //connect up the data-in event
      if (FStatusTrigger = 0) then begin
           FStatusTrigger:= ApdComPort1.AddStatusTrigger(stline); //sline
           ApdComPort1.SetStatusTrigger(FStatusTrigger,
                           lsOverrun or lsParity or lsFraming or lsBreak,
                                       True);
      end;
   end;
   except
     writeln('ArdEWI_Exc:'+ExceptiontoString(exceptiontype,exceptionparam));
   finally
     ApdComPort1.free;
     writ('debug finfree')
   end;  
  // For this sensor we know the data is coming in within this data range
  // so am setting Left Axis Min Max accordingly
  Chart1.Axes.Left.SetMinMax(-1,4);
  //ChartTool1.Value := 2.5;   //colorline as threshold line
 end;

Example of TMS Script

unit fMain;

interface

uses
  Windows, Messages, SysUtils,
  Variants,
  Classes, Graphics, Controls, Forms,
  Dialogs, StdCtrls, atScript, FormScript, atPascal, ScrMemo, Scrmps,
  Vcl.ScripterInit;

type
  TForm1 = class(TForm)
    Button1: TButton;
    log: TListBox;
    countrycombo: TComboBox;
    Label1: TLabel;
    Label2: TLabel;
    Label3: TLabel;
    Label4: TLabel;
    atPascalFormScripter1: TatPascalFormScripter;
    ScrMemo1: TScrMemo;
    ScrPascalMemoStyler1: TScrPascalMemoStyler;
    procedure Button1Click(Sender: TObject);
    procedure StAdd;
    procedure StClear;
  private
    { Private declarations }
  public
    { Public declarations }
  end;

var
  Form1: TForm1;

implementation

{$R *.dfm}

procedure TForm1.Button1Click(Sender: TObject);
begin
  // teaches TatPascalScripter how to execute the Add & Clear methods for TStrings
  with atPascalFormScripter1.AddDelphiClass(TStrings) do begin
    AddMethod('Add',1,tkNone,nil,StAdd);
    AddMethod('Clear',0,tkNone,nil,StClear);
  end;
  atPascalFormScripter1.SourceCode.Assign(ScrMemo1.Lines);
  atPascalFormScripter1.Execute(null);
end;


// implementation of component method Add of TStrings
procedure TForm1.StAdd;
begin
  TStrings(atPascalFormScripter1.CurrentObject).Add(atPascalFormScripter1.GetInputArgAsString(0));
end;

// implementation of component method Clear of TStrings
procedure TForm1.StClear;
begin
  TStrings(atPascalFormScripter1.CurrentObject).Clear;
end;

end.

Connect to an ADO ODBC Database

Const CSTRING5= 'Provider=MSDASQL.1;Persist Security Info=False;Data Source=countries64japan;';

function DataBaseConnection_Test(bMessage: Boolean): AnsiString;
var
  asTimeout, asUserName, asPassword, asDataSource, ConnectionString: AnsiString;
  iReturn: Integer; OldCursor: TCursor;
begin
  OldCursor     := Screen.Cursor;
  Screen.Cursor := crHourGlass;
  asTimeout     := '150';
  asUserName    := 'NT_Server'; asPassword:= 'SA';
  asDataSource  := 'SQL Server - My DataBase';

 { ConnectionString := 'Data Source = ' + asDataSource +
    'User ID = ' + asUserName +
    'Password = ' + asPassword +
    'Mode = Read|Write;Connect Timeout = ' + asTimeout;      }

  ConnectionString:= CSTRING5;

  try
    iReturn:= OpenConnection(ConnectionString);
    if (bMessage) then begin
      if (iReturn = 0) then begin
        Application.MessageBox(utf8toansi('Connection_OK!'), ('Information'), MB_OK);
        writeln('ADO Connection OK!');
      end else if (iReturn = -1) then
        Application.MessageBox(utf8toansi('Connection Error!'), 'Error', MB_ICONERROR + MB_OK);
    end;
    if (iReturn = 0) then
      Result:= ConnectionString
    else if (iReturn = -1) then
      Result:= 'no thing';
  finally
    Screen.Cursor:= OldCursor; //crDefault; //OldCursor;
  end;
end;


function OpenConnection(ConnectionString: AnsiString): Integer;
var ADODBConnection: OleVariant; rs: Olevariant;
    sqlQuery: string;
begin
  ADODBConnection:= CreateOleObject('ADODB.Connection');
  ADODBConnection.CursorLocation:= 3; // User client
  ADODBConnection.ConnectionString:= ConnectionString;
  Result:= 0;
  try
    ADODBConnection.Open;
    sqlQuery:= 'SELECT * FROM Countries'; //Table1';
    rs:= ADODBConnection.Execute(sqlQuery);
    While Not rs.EOF do begin
       writeln(inttostr(rs.Fields('ID').Value)+': '+rs.Fields('Country').Value+
                                #9#9#9#9#9+'  code: '+rs.Fields('Code').Value);
       //writeln((rs.Fields('Country').Value));                           
       rs.MoveNext;
    end;   
  except
    Result:= -1;
  finally
    rs.close;
    ADODBConnection.close;  
  end;
end;

MP3 Stream Shell

We found a solution with GitHub as a media server. Our URL-to-MP3 converter stream server serves as an open source for online song downloading or streaming. Either you play it in a desktop app or you run it with WebView2 embedding. You can copy the URL of the song you want to download, paste it into our online MP3 converter website, or play it directly as a URL stream in the following script:

https://sourceforge.net/projects/maxbox5/files/examples/1422_mp3stream_sternenhaufen.txt/download

SongURL5 = 'https://raw.githubusercontent.com/maxkleiner/maXbox5/main/assets/Sternenhaufen_Max_SunoAI.mp3';
Tutor152 =
https://github.com/maxkleiner/maXbox5/blob/main/examples/maxbox_starter152_MP3_Sound_Engine.pdf

var mp3Stream: TMemoryStream;
    mp3Stream:= TMemoryStream.create;
    Http_GetStream(SongURL5, mp3Stream);
    PlayMp3FromStream(mp3stream);
    mp3stream.free;  //}

A raw GitHub link allows you to access the raw content of a file stored in a GitHub repository. This can be useful for directly accessing files such as images, songs, text files, or scripts without the GitHub interface. We use this also to execute a JavaScript link to run in a local embedding browser in maXbox.
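As a cross-check outside maXbox, the raw link can be fetched like any other file (a minimal sketch, assuming the requests package is installed):

# download the raw GitHub asset used above
import requests

url = ("https://raw.githubusercontent.com/maxkleiner/maXbox5/main/assets/"
       "Sternenhaufen_Max_SunoAI.mp3")
data = requests.get(url, timeout=30).content
with open("Sternenhaufen_Max_SunoAI.mp3", "wb") as f:
    f.write(data)
print(len(data), "bytes downloaded")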

Sound Stream Shell

The ComObject CreateOleObject('WMPlayer.OCX') control gives you direct control of media, including reading and writing to the files or metadata/tags.
Additionally, some automation interfaces are implemented by automation objects, and the properties and methods of those interfaces are inherited by the objects like wmp.controls.play.

const SoundCloudURL = 'https://soundcloud.com/max-kleiner-481715070/sternenhaufen';

procedure PlaySurprise_MP3_Media2;
var wmp: Variant; 
 //Maybe you'll be more comfortable with automation.
 //I believe it would provide most of functionality as interfaces provide.
begin
   wmp:= CreateOleObject('WMPlayer.OCX');
   if ISInternetconnected then begin
     wmp.URL:= SongURL5;
     wmp.OpenPlayer(wmp.URL);
   end else  //}
   wmp.OpenPlayer(Exepath+'examples\Sternenhaufen_Max_SunoAI.mp3');
  //wmp.controls.play;
end; 

Conclusion

The SndPlaySound API only supports waveform audio. It is not a general purpose multi-media API and as such does not (directly nor easily) support MP3 playback.
To play your audio through this API you would first need to decode the MP3 into the waveform format that the API expects. With the COM object CreateOleObject('WMPlayer.OCX') you get a URL-enabled player for streaming an MP3 file.

TResourceStream is a memory stream that provides access to the compiled resources in an application. Use TResourceStream to read the resources of an application. An instance of TResourceStream holds the value of a single resource in a memory buffer where it is accessible to the application.

Weatherbox https://sourceforge.net/projects/maxbox5/files/examples/1415_weather_listview53.txt/download
The Heat Beat
from soundcloud to mp3stream as cloudstream

sr:= GetGeoWeather('Rom', UrlWeatherReport25); 

Edge WebView2 Sternenhaufen
3 Nation Multisystem Locs – DB, B, SNCF – BB 181 212-2 Luxembourg, BB 1608, CC 40110

WeatherApp 5.5

We use two REST APIs, from OpenWeatherMap and Mapbox, and also WebView2 to call a URL directly for forecasting weather data and rendering a satellite map:

To see the satellite view you have to press Overview.. and then the Go button in the browser:

EKON 29

https://entwickler-konferenz.de/delphi-innovations-fundamentals/statistic-packages-fuer-delphi-oder-python

TEE Capitole – Jouef CC 6526, Roco BB 9278, L.S. Models CC 6513
Jouef HJ2373 CC21001 grisargent,CC21002 Champagnole,HJ2421 21003,HJ2139 21004 et Champagnole

http://lestrainsjouef.free.fr/fr/elec_fra/cc21000z_sncf.html#hj

mX5 TrackTest
ACME Classic Train
Avmz 207 61 80 19-90 589-1
https://www.reisezugwagen.eu/?p=35243

Set of 8 Trans Europ Express 84/85 Mediolanum coaches

in service between Milan and Munich.

Reproduction of the original 8-coach consist used in the summer 1978 timetable; the set consists of 8 coaches:

– compartment coach 61 83 18-90 900-8 Az FS

– compartment coach 61 83 18-90 901-6 Az FS

– open saloon coach 61 83 18-90 951-1 Az FS

– restaurant car 61 83 88-90 901-1 WRz FS

– generator baggage car 61 83 95-90 900-4 Dz FS

– compartment coach 61 80 19-90 589-1 Avmz DB

– compartment coach 61 80 19-90 526-3 Avmz DB

– compartment coach 61 80 19-70 575-4 Avmz DB

FS – DB, carefully detailed on the outside and in the reproduction of the interiors

https://www.salentorailroad.it/it/acme-55190-set-8-carrozze-fs-tee-84-85-mediolanum-milano-monaco-estivo-1978/
4 Nation Locs – F G CH I – CC 21002 Jouef, Lilliput – HAG 460 083-9 Rivarossi E444 053
Minitrix 12820
CC 6544
Minitrix, Arnold Maurienne CC 6544, CC 6541

The Open Railway Map API

The OpenRailwayMap (ORM) is an OpenStreetMap-based project designed to create a detailed map of the world’s railway infrastructure. It provides a variety of tools, APIs, and map styles to visualize railway data, including tracks, stations, signals, and speed limits.

Key Features and Usage

The OpenRailwayMap database is built on OpenStreetMap data and is available under the Open Database License (ODbL). The rendered map tiles are distributed under the CC-BY-SA 2.0 license. The project is non-commercial and maintained by volunteers, relying on donations for its operation.

The database supports querying railway-related information through a RESTful API. This API allows users to search for stations by name or reference code and retrieve mileage data for specific railway lines. The API documentation is hosted on GitHub.

Map Styles and Tile Access

The map tiles are rendered in Web Mercator projection and are available in different styles, such as:

  • Standard: Displays railway infrastructure like tracks, stations, and switches.
  • Signals: Visualizes railway signals and train protection systems.
  • Maxspeed: Shows maximum speeds and speed signals for railway lines.

Tiles can be accessed via URLs in the format:

http://${s}.tiles.openrailwaymap.org/${style}/${z}/${x}/${y}.png

Here, ${s} can be replaced with subdomains (a, b, or c) for faster loading, and ${style} specifies the map style.
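A minimal sketch of how such a tile URL can be assembled (the tile coordinates here are hypothetical; mind the usage policy):

# build an OpenRailwayMap tile URL from the documented template
style = "standard"                  # or "signals", "maxspeed"
s, z, x, y = "a", 14, 14552, 6451   # hypothetical subdomain and tile coordinates
url = f"http://{s}.tiles.openrailwaymap.org/{style}/{z}/{x}/{y}.png"
print(url)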

Integration with Tools

The OpenRailwayMap tiles can be integrated into various mapping libraries and tools:

  • OpenLayers and Leaflet: Easily add OpenRailwayMap layers using their respective APIs.
  • OsmAnd: Overlay OpenRailwayMap tiles on offline maps with the “Online maps” plugin.
  • QGIS: Load OpenRailwayMap tiles as XYZ layers for GIS analysis.

The OpenRailwayMap API and tiles are free for non-commercial, small-scale applications. Commercial use requires setting up a private server. Bulk requests and misuse of headers (e.g., faking user-agent) are prohibited. Applications must include proper attribution when using the tiles or API. For example we search for Kamakura Station, Japan:

OpenRailwayMap

Kamakura Station 鎌倉

The project operates without guarantees of availability or support. Users requiring high reliability are encouraged to deploy their own instances of the API or image tile server.

For further details, visit the OpenRailwayMap GitHub repository or the OpenStreetMap Wiki page.

https://wiki.openstreetmap.org/wiki/OpenRailwayMap/API

So first we call the REST API to get a station's facility information in JSON:

const URL_ORM_GET9 = 
   'https://api.openrailwaymap.org/v2/facility?name=%s&limit=1';
   
function API_GEOLocation_ORM9(AURL, aloc, aApikey: string;
                                             verbose: boolean): Tlatlong;
var Httpreq: THttpRequestC; httpres: string; jsn: TMcJsonItem;
begin
  httpreq:= THttpRequestC.create(self);
  httpreq.headers.add('Accept: application/json; charset=utf-8');
  //httpreq.headers.add('X-Api-Key:'+aAPIkey);
  httpreq.useragent:= USERAGENT5;
  httpreq.SecurityOptions:= [soSsl3,soPct,soIgnoreCertCNInvalid];
  try
    if httpreq.get(Format(AURL,[aloc])) then begin
       httpres:= (httpreq.Response.ContentAsUTF8String);
       writeln('conttype '+httpreq.Response.ContentType);
       if verbose then writ('debug back '+formatJson(httpres));
       jsn:= TMcJsonItem.Create;
       jsn.AsJSON:= httpres;
       writ('debug name: '+jsn.at(0,'name').asstring);
       writ('debug operator: '+jsn.at(0,'operator').asstring);
       result.lat:= jsn.at(0,'latitude').asnumber;  
  //in the api now fixed
       result.long:= jsn.at(0,'longitude').asnumber;
     
       result.descript:= Format('Coords: lat %2.5f lng %2.5f %s osm_id: %s operator: %s',
                  [result.lat,result.long,jsn.at(0,'name').asstring,
                                        jsn.at(0,'osm_id').asstring,
                                        jsn.at(0,'operator').asstring]); 
                                      
    end else Writeln('APIError '+inttostr(Httpreq.Response.StatusCode2));
  except 
    writeln('EWI_APIHTTP: '+ExceptiontoString(exceptiontype,exceptionparam));  
  finally 
    writeln('Status3: '+gethttpcod(httpreq.Response.statuscode2));
    httpreq.Free;  
    sleep(200);
    jsn.Free;
  end; 
end;

The API returns JSON-formatted data with the following fields (a small Python sketch follows the list):

  • latitude: latitude, longitude: longitude
  • osm_id: OSM node ID
  • rank: an importance rank calculated by taking the public transport route relations using this station/halt into account
  • all OSM tags present on this object; the following tags are very often in use (see the OSM wiki and Taginfo for a more comprehensive list of possible tags)
  • name: name, uic_name: UIC station name
  • railway:ref: reference assigned by the operator of the infrastructure
  • railway: type of the facility (following the tagging rules), e.g. station, halt, junction, yard
  • operator: operator of the infrastructure
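
The same facility call as a plain-Python sketch (not the maXbox code; it assumes the requests package and keeps to the policy with a descriptive User-Agent):

# query the ORM facility API and read a few of the fields above
import requests

URL = "https://api.openrailwaymap.org/v2/facility"
resp = requests.get(URL, params={"name": "Kamakura", "limit": 1},
                    headers={"User-Agent": "mx5-orm-demo/1.0"}, timeout=5)
resp.raise_for_status()
item = resp.json()[0]        # first (and only) facility returned
print(item["name"], item["latitude"], item["longitude"], item.get("operator"))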

Open Railway Map:________________________________________________

conttype application/json
debug back [{
"osm_id": 506122717,
"name": "鎌倉",
"railway": "station",
"ref": null, "train": "yes",
"name:en": "Kamakura",
"name:es": "Kamakura JR",
"name:it": "Kamakura",
"name:ja": "鎌倉",
"name:ko": "가마쿠라",
"name:zh": "镰仓",
"operator": "東日本旅客鉄道",
"wikidata": "Q932895",
"wikipedia": "ja:鎌倉駅",
"name:ja-Hira": "かまくら",
"name:ja-Latn": "Kamakura",
"public_transport": "station",
"latitude": 35.31911869967492,
"longitude": 139.5504286,
"rank": 12
}]
debug name: \u938c\u5009
debug operator: \u6771\u65e5\u672c\u65c5\u5ba2\u9244\u9053

Status3: SC_OK Coords: lat 35.31912 lng 139.55043 鎌倉 osm_id: 506122717 operator: 東日本旅客鉄道
# mX5🐞 executed: 02/11/2025 08:48:43 Runtime: 0:4.305 Memload: 64% use

I also got a mail about a bugfix which is now solved¹: "Thank you for the bug report and the comment reminding me. I adapted the frontend JavaScript code as well because it assumed latitude and longitude to be swapped as well. It seems that the old API had this bug, too." 🫢

The call for the map above has to be set with the coordinates, which we got from the facility API:

OpenWeb('https://www.openrailwaymap.org/?style=standard&lang=en&lat='+
flots(togeo.lat)+'&lon='+flots(togeo.long)+'&zoom=14');

Most of the UTF-8 escapes we de-serialize with the function jsonunescape():
writ(jsonunescape((togeo.descript),#13#10));

>>> Coords: lat 35.31912 lng 139.55043 鎌倉 osm_id: 506122717 operator: 東日本旅客鉄道
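
The \uXXXX sequences in the debug output above are ordinary JSON string escapes, so the same un-escaping can be checked in plain Python (a minimal sketch):

import json
print(json.loads('"\\u938c\\u5009"'))   # -> 鎌倉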

A string is a sequence of zero or more Unicode characters [UNICODE]. An object is an unordered collection of zero or more name/value pairs, where a name is a string and a value is a string, number, boolean, null, object, or array. An array is an ordered sequence of zero or more values.


OpenWeb('https://www.latlong.net/c/?lat='+flots(togeo.lat)+'&long='+flots(togeo.long));
OpenWeb('https://www.openrailwaymap.org/?style=standard&lang=en&lat='+
                    flots(togeo.lat)+'&lon='+flots(togeo.long)+'&zoom=14');

Most API requests should normally take no longer than 3 seconds. In any case, you should write your application so that it will time out requests after ca. 5 seconds. Applications must send a unique HTTP User-Agent. If map tiles are embedded into a website, browsers must send a valid HTTP referer instead.

¹ https://github.com/OpenRailwayMap/OpenRailwayMap-api/issues/6

Max Kleiner, Text, Code & Photos, November 2025

Appendix, Source and Links

Ref: OpenRailwayMap/API — OpenStreetMap Wiki

https://www.openrailwaymap.org/

Japan 2025 — Breitschblog

https://wiki.openstreetmap.org/wiki/OpenRailwayMap/API

Script: https://github.com/maxkleiner/OpenRailwayMap-api/blob/master/tests/1444_OpenRailMap_Geolocation_distance14_uc_py.txt

ACME E.646.185
FS, class E.656 electric locomotive, 5th production series, blue/grey livery, "50th anniversary edition"
https://www.arte.tv/de/videos/122692-000-A/der-trans-europ-express-tee/
https://www.drehscheibe-online.de/foren/read.php?017,8148759
Minitrix, Arnold Maurienne CC 6544, CC 6541
Arnold CC 6502 Mistral
Roco CC 6553 #78622
DB FM 111 205-1,Mä 110 116-1,Mä 120 159-9
FM-Märklin-HAG-Roco
AKE Rheingold 2025

Geocoding Distance🐞

maXbox Starter 142 – Get a GEO distance.

"Natura abhorret vacuum"¹ – Spinoza.

Source: 1390_Sphenic_Numbers2TIO_12_py_uc.txt
1397_Geolocation_distance12_uc.txt

Nominatim can power the search box on your website, allowing your users to type free-form queries (“Cafe Paris, Cologne”) in any language. It also offers a structured query mode (“postcode=12345”, “city=London”, “type=cafe”) that helps you to automate geocoding of extensive address lists.

https://sourceforge.net/projects/maxbox5/files/examples/1274_GoogleMapsAPI_General1.pas/download

Sometimes life seems to play with us: you write some useful code that solves a problem, and then someone comes along and makes the problem much harder. Here's how to continuously integrate new solutions without having to rewrite your old solutions (as much). That means you don't have to change the language, you change the environment.

Let’s start with a simple problem: Get the distance between two earth-coordinates. Calculating the geographical distance between two points on the Earth’s surface can be done using the Haversine formula. This formula accounts for the spherical shape of the Earth. Below is a Delphi and Python function that performs this calculation:

type
 TipGeolocation = record
     Latitude: Double; //read Coordinates[0] write SetLatitude;
     Longitude: Double; //read Coordinates[1] write SetLongitude;
  end;

For the purposes of this task we make the problem harder: a geocoding function with a record pair of coordinates:

const
    EARTHS_RADIUS_IN_METERS = 6378137;  
 
 
function TipGeolocationDistanceInMetersTo(
                     const AGeolocation: TipGeolocation;
                           Latitude, Longitude: Double): Double;
var LDeltaLat, LDeltaLong, LA: Double;
begin
  LDeltaLat:= DegToRad(AGeolocation.Latitude - Latitude);
  LDeltaLong:= DegToRad(AGeolocation.Longitude - Longitude);
  LA:= Sin(LDeltaLat / 2) * Sin(LDeltaLat / 2) + Cos(DegToRad(Latitude)) * 
       Cos(DegToRad(AGeolocation.Latitude)) * Sin(LDeltaLong / 2) * Sin(LDeltaLong / 2);
  Result:= Abs(EARTHS_RADIUS_IN_METERS * 2 * ArcTan2(Sqrt(LA), Sqrt(1 - LA)));
end;

This function calculates the distance between two points given their latitude and longitude in degrees. The example usage calculates the distance between Bern and Paris. Feel free to adapt the coordinates to your specific needs!

Geoloc_setup;    //reference data2
writeln('Distance Bern,CH - Paris,F: '+
         format('%2.4f m ',[TipGeolocationDistanceInMetersTo(tipgeoloc1,lat2,lon2)]));                           

And in Python:

import math

def haversine_distance(lat1, lon1, lat2, lon2):
    # Radius of the Earth in kilometers
    R = 6371.0
    
    # Convert latitude and longitude from degrees to radians
    lat1_rad = math.radians(lat1)
    lon1_rad = math.radians(lon1)
    lat2_rad = math.radians(lat2)
    lon2_rad = math.radians(lon2)
    
    # Differences in coordinates
    dlat = lat2_rad - lat1_rad
    dlon = lon2_rad - lon1_rad
    
    # Haversine formula
    a = math.sin(dlat / 2)**2 + math.cos(lat1_rad) *math.cos(lat2_rad) * math.sin(dlon / 2)**2                                                  
    c = 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))
    
    # Distance in kilometers
    distance = R * c
    return distance

# Example usage
lat1 = 46.94809  # Latitude of Bern
lon1 = 7.44744   # Longitude of Bern
lat2 = 48.8566   # Latitude of Paris
lon2 = 2.3522    # Longitude of Paris

distance = haversine_distance(lat1, lon1, lat2, lon2)
print(f"The distance between Bern and Paris is {distance:.2f} kilometers.")
>>> The distance between Bern and Paris is 434.96 kilometers.

Often, we calculate the great-circle distance of two given points in geophysics, for example, the distance from the seismic station to the epicenter. Suppose that we have two geographical points, say P1: (φ1, λ1) and P2: (φ2, λ2), where φ and λ represent the latitude and longitude, respectively. Using the Haversine formula, we can calculate the great-circle distance.
The great-circle distance, orthodromic distance, or spherical distance is the distance between two points on a sphere, measured along the great-circle arc between them.
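
For reference, the Haversine formula that both functions above implement can be written as (with Δφ = φ2 − φ1, Δλ = λ2 − λ1 and Earth radius R):

d = 2R \arcsin\left(\sqrt{\sin^2\frac{\Delta\varphi}{2} + \cos\varphi_1\,\cos\varphi_2\,\sin^2\frac{\Delta\lambda}{2}}\right)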

Paris, Place de la Madeleine

Next we need a geocoding API. Geocoding is the process of transformation of any location name into geographical coordinates, and the other way around (reverse geocoding). OpenWeather’s Geocoding API and OpenStreetMap supports both the direct and reverse methods, working at the level of city names, areas and districts, countries and states:

Test the link above and you get:

It's not a unique place, so different results are possible as JSON.

Take the result with the highest importance after place_rank, like 0.50962:

const URL_GEOLOCURL9 = 'https://nominatim.openstreetmap.org/search?format=json&q=%s';
      URL_APILAY_GEO = 'https://api.apilayer.com/geo/country/capital/%s';  
      
var fromgeo, togeo: Tlatlong;      
 
function API_GEOLocation_OSM9(AURL, aloc, aApikey: string;
                                             verbose: boolean): Tlatlong;
var Httpreq: THttpRequestC; httpres: string;
    jsn: TMcJsonItem;
begin
  httpreq:= THttpRequestC.create(self);
  httpreq.headers.add('Accept: application/json; charset=utf-8');
  //httpreq.headers.add('X-Api-Key:'+aAPIkey);
  httpreq.useragent:= USERAGENT5;
  httpreq.SecurityOptions:= [soSsl3, soPct, soIgnoreCertCNInvalid];
  try
    if httpreq.get(Format(AURL,[aloc])) then begin
       httpres:= (httpreq.Response.ContentAsUTF8String);
       writeln('conttype '+httpreq.Response.ContentType);
       if verbose then writ('debug back '+formatJson(httpres));
       jsn:= TMcJsonItem.Create;
       jsn.AsJSON:= httpres;
       result.lat:= jsn.at(0,'lat').asnumber;
       result.long:= jsn.at(0,'lon').asnumber;
       result.descript:= Format('Coords: lat %2.5f lng %2.5f %s osm_id: %s ',
                                [result.lat,result.long,jsn.at(0,'name').asstring,
                                                        jsn.at(0,'osm_id').asstring]); 
    end else Writeln('APIError '+inttostr(Httpreq.Response.StatusCode2));
  except 
    writeln('EWI_APIHTTP: '+ExceptiontoString(exceptiontype,exceptionparam));  
  finally 
    writeln('Status3: '+gethttpcod(httpreq.Response.statuscode2));
    httpreq.Free;  
    sleep(200);
    jsn.Free;
  end; 
end;


Test it by calling the const URL_GEOLOCURL9 above, for example:

 Nominatim uses OpenStreetMap data to find locations on Earth by name and address (geocoding) as you can see in the const URL_GEOLOCURL9. It can also do the reverse, find an address for any location on the planet. The Geocoding API relies on a redundant infrastructure of geocoder nodes powered by the Nominatim software with edge servers for fast delivery of results.
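A minimal plain-Python sketch of such a query (assuming the requests package; the User-Agent string here is a placeholder you should replace with your own):

# direct Nominatim search with a descriptive User-Agent, as the policy requires
import requests

URL = "https://nominatim.openstreetmap.org/search"
resp = requests.get(URL, params={"format": "json", "q": "Bern"},
                    headers={"User-Agent": "mx5-geodistance-demo/1.0"}, timeout=5)
resp.raise_for_status()
hit = resp.json()[0]                      # take the first (highest-ranked) hit
print(hit["display_name"], hit["lat"], hit["lon"])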
Note that when you get an

APIError 403

Status3: SC_FORBIDDEN

Status3: SC_FORBIDDEN

Exception: Invalid pointer operation at 875.1746

or an

EWI_HTTP: Exception: The request has timed out.

Status3:

Null Pointer Exception at 929.1876

then you are not following the Nominatim Usage policy, see here: https://operations.osmfoundation.org/policies/nominatim/

You'll have to be especially careful about including a unique user agent for your application and changing it from time to time;

It sounds like you’re encountering a “403 Forbidden” error when trying to use the Nominatim API. This error typically indicates that your request is being blocked, possibly due to rate limiting, incorrect usage, or IP blocking. Here are a few steps you can take to troubleshoot and resolve this issue:

Check API Usage Limits: Ensure that you are not exceeding the usage limits set by Nominatim. They have strict policies to prevent abuse. Review their usage policy to make sure you’re compliant.

User-Agent Header: Nominatim requires a valid User-Agent header in your requests. Make sure you include a descriptive User-Agent string that identifies your application. Best practice is to rotate user agents:

const
  USERAGENT5 = 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko)'+
                   ' Chrome/126.0.0.0 Safari/537.36 Edg/126.0.0.0 '; 
  USERAGENT2 = 'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1';  
  USERAGENT3 = 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko)'+
                   ' Chrome/125.0.0.0 Safari/537.3';                     // chrome + win   
  USERAGENT4 = 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15'+
                   ' (KHTML, like Gecko) Version/17.4.1 Safari/605.1.1'; // safari + mac
  USERAGENT1 = 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 '+    
                   '(KHTML, like Gecko) Chrome/126.0.0.0 Edg/126.0.0.';  // edge + win

const useragents

softwareschule.ch/examples/geodistance1.htm
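
A plain-Python analogue of rotating those constants would simply pick a random User-Agent per request (a minimal sketch; the strings are shortened examples):

import random

USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) "
    "Chrome/126.0.0.0 Safari/537.36 Edg/126.0.0.0",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 "
    "(KHTML, like Gecko) Version/17.4.1 Safari/605.1.15",
]
headers = {"User-Agent": random.choice(USER_AGENTS)}   # use a different UA per request
print(headers)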

Most of the time, ~75% in this case, is spent on the API IP connection. So in maXbox we put processmessagesOFF; and processmessagesON; between the start and stop time for the time measurement.

The interesting point is to know where the code is running and how it is stored, in an executable or in the script itself. Solutions 1 and 2 can run on the web or on premise; all others run locally or on a server.

3 Solutions Overview of GEO Distance Math Solver

  • Internal scripted TipGeolocationDistanceInMetersTo() in maXbox5
  • External def haversine_distance () call of Python for Delphi(P4D)
  • Internal compiled function SphericalEarthDistance()
Past | Present | known
Big Bang | Future | unknown
unchangeable | changeable | Time/Space

The multiplication of past times future is a vector with the function: known = f(changeable) [y=f(x)] as distance over time, so distance is a function of time: d=f(t)

Time Space Matrix States

Conclusion

When it comes to problem-solving, there are often multiple solutions that can be used to solve the same problem. The choice of solution depends on various factors such as performance, storage, implementation, simplicity, and also scalability and security in different environments. The code is more or less the same, but the choice of the environment (script, executable, container, hosting, web or cloud API) could be a response to different requirements.

Great Circle Mapper is a tool that shows the shortest route between two or more locations on a globe. You can enter airports, cities, or coordinates and get a distance back with a geocode API.

Script:

https://sourceforge.net/projects/maxbox5/files/examples/1397_Geolocation_distance12_uc.txt/download

References:
Home – Geocoding API Documentation

Doc and Tool: maXbox5 – Manage Files at SourceForge.net

As PDF: http://www.softwareschule.ch/download/maxbox_starter142.pdf

Max Kleiner 23/04/2025

¹ Nature recoils from emptiness.

Garmin GPS Distance Tracking
TGV Inoui 543-550-540 Paris Gare d’Est Ostern 2025
Published on Page 10, Geocoding Distance

From our technical advisor By Jerry King

From your Editor

Why ‘OPEN SOURCE’ AI isn’t truly Open By Stefano Maffuli

Geocoding Distance By Max Kleiner

Alea Iacta Est, the die is cast By Reinhold Schmitt

Programming the QUEENS puzzle By David Dirkse

Threading in the Browser: Understanding how to create a WebAssembly application By Michael van Canneyt

ORMs for various platforms A blog created by Kim Madsen (kbmMW)

Quantum Network Explorer By Prof.Dr. Stephanie Wehner

Explaining the extended RTTI in PAS2JS By Michael van Canneyt

CC 21004 by Jouef
Polytension Locs CC 40109, BB 1606, Br 184 001-6
Polytension Locs II 3 Nations
Piko 5420

7 Data Science Statistical Methods

Data science is a multidisciplinary field that combines math, statistics, computer science, machine learning, and domain expertise to extract insights from data. While data science algorithms often take the spotlight, a solid foundation in statistical methods can be just as pivotal.

The code in the script is built on Python, specifically Python4Delphi (P4D).

1. Bayesian Inference

Bayesian inference uses Bayes’ theorem to update the probability of a hypothesis as more evidence or information becomes available. 

Bayesian statistics offers a robust and flexible framework for understanding how beliefs should be updated in light of new evidence. This approach stands in contrast to classical (frequentist) statistics.

//# Suppose we observed 20 coin flips with 12 heads and 8 tails
execstr('import pymc as pm'+LF+'import numpy as np');
execstr('observ_heads = 12; observ_tails = 8');
      
execstr('with pm.Model() as model:                                 '+LF+
          '    # Prior for the bias of the coin (theta)            '+LF+
          '    theta = pm.Beta(''theta'', alpha=1, beta=1)         '+LF+
          '    # Likelihood                                        '+LF+
          '    y = pm.Binomial(''y'', n=observ_heads+observ_tails, p=theta, observed=observ_heads)'+LF+
          '    # Posterior sampling                                '+LF+
          '    trace = pm.sample(200, tune=1000, cores=1, chains=2)   ');
          
execstr('pm.summary(trace)');

We perform Bayesian parameter estimation for a Bernoulli process (e.g., coin flips).
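
For this coin-flip example the posterior even has a closed form, since the Beta prior is conjugate to the binomial likelihood, which is handy to cross-check the sampler (a minimal sketch assuming SciPy):

from scipy.stats import beta

heads, tails = 12, 8
posterior = beta(1 + heads, 1 + tails)        # Beta(1,1) prior updated with the data
print("posterior mean:", posterior.mean())    # ~0.59, close to 12/20
print("95% credible interval:", posterior.interval(0.95))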

2. Hypothesis Testing (t-test)

Hypothesis testing involves formulating a null hypothesis (no difference/effect) and an alternative hypothesis. A t-test specifically checks if the means of two groups are significantly different.

execstr('from scipy.stats import norm');
execstr('from scipy.stats import ttest_ind')
     //# Synthetic data
execstr('group_A= np.random.normal(5,1,50); group_B= np.random.normal(5.5,1.2,50)');
execstr('stat, pvalue = ttest_ind(group_A, group_B)');
      
execstr('print(f"T-statistic: {stat:.2f}, p-value: {pvalue:.4f}")'+LF+
      'if pvalue < 0.05:                                                     '+LF+
      '    print("Reject the null hypothesis (Significant difference).")     '+LF+
      'else:                                                                 '+LF+
      '    print("Fail to reject null hypothesis (No significant difference).")');

A t-test is a statistical test that is used to compare the means of two groups. It is often used in hypothesis testing to determine whether a process or treatment actually has an effect on the population of interest, or whether two groups are different from one another.

3. Factor Analysis (Loading Analysis)

Oh, that's a really big topic in statistics. Factor Analysis models the observed variables as linear combinations of latent (unobserved) factors, and is often used for dimensionality reduction or to uncover hidden structure.

Factor Analysis is a statistical method used to describe variability among observed, correlated variables in terms of a potentially lower number of unobserved variables called factors. This technique helps in reducing the number of variables by identifying a smaller number of underlying factors that explain the correlations among the observed variables.

Key Principles

Factors: In factor analysis, a factor refers to an underlying, unobserved variable or latent construct that represents a common source of variation among a set of observed variables. These observed variables are measurable and directly observed in a study.

Factor Loadings: Factor loadings represent the correlations between the observed variables and the underlying factors. They indicate the strength and direction of the relationship between each variable and each factor.

//!pip install factor_analyzer
execstr('from factor_analyzer import FactorAnalyzer');
     //# Synthetic data (100 samples, 6 variables)
execstr('X = np.random.rand(100, 6)                              '+LF+ 
         'fa = FactorAnalyzer(n_factors=2, rotation=''varimax'')  '+LF+ 
         'fa.fit(X)                                               '+LF+ 
         'print("Loadings:\n", fa.loadings_)                       '); 

Factor Analysis is a method for modeling observed variables, and their covariance structure, in terms of a smaller number of underlying unobservable (latent) “factors.” The factors typically are viewed as broad concepts or ideas that may describe an observed phenomenon. For example, a basic desire of obtaining a certain social level might explain most consumption behavior.

4. Cluster Analysis (K-means) 

Clustering partitions data into homogeneous groups (clusters) based on similarity. K-means is a popular centroid-based clustering technique.

execstr('from sklearn.cluster import KMeans ');
     //# Synthetic data: 200 samples, 2D
execstr('X = np.random.rand(200, 2)');
execstr('kmeans = KMeans(n_clusters=3, random_state=42)');
execstr('kmeans.fit(X)');
execstr('print("Cluster centers:", kmeans.cluster_centers_)');
execstr('print("Cluster labels:", kmeans.labels_[:10])');

Cluster centers: [[0.48831729 0.1946909 ]
[0.20469491 0.71104817]
[0.74510634 0.71980305]]
Cluster labels: [1 1 2 0 2 0 0 2 1 1]

K-means cluster analysis is an iterative, unsupervised learning algorithm used to partition a dataset into a predefined number of clusters (k).

5. Logistic Regression Classifier

Logistic regression is used for binary classification, modeling the probability of a certain class or event existing.

execstr('from sklearn.linear_model import LogisticRegression');
execstr('from sklearn.linear_model import LinearRegression');
execstr('from sklearn.metrics import mean_squared_error');
execstr('from sklearn.model_selection import train_test_split');
execstr('np.random.seed(50)                                                '+LF+
         'X = np.random.rand(100, 5)                                        '+LF+
         'y = np.random.randint(0, 2, 100)  # Binary labels                 '+LF+
             'X_train,X_test,y_train,y_test=train_test_split(X,y,test_size=0.2) '+LF+
         'model = LogisticRegression()                                      '+LF+
         'model.fit(X_train, y_train)                                       '+LF+
         //# Predicting the target values for the test set
         //'y_pred = model.predict(X_test)                                   '+LF+
         '# Calculating the model score (R^2 score)                         '+LF+
         'accuracy = model.score(X_test, y_test)                            '+LF+
         'print(y_test)                                                     '+LF+
         'print(f"R^2 Score: {accuracy}")                                   '+LF+
         'print("Class Accuracy:", accuracy)                                ');
  
execstr('y_pred = model.predict(X_test)  '+LF+       
        //# Calculating the Mean Squared Error
         'mse = mean_squared_error(y_test, y_pred)' +LF+
         'print(f"Mean Squared Error: {mse}") '+LF+
         'print(y_test)'+LF+
         'print(y_pred)  ');  
     
//https://www.geeksforgeeks.org/difference-between-score-and-accuracy_score-methods-in-scikit-learn/
execstr('from sklearn.metrics import accuracy_score');       
execstr('# Method 3: Using accuracy_score function                 '+LF+
         'accuracy = accuracy_score(y_test, y_pred)                 '+LF+
         'print(f"accuracy_score function - Accuracy: {accuracy}")  ');        
     

[1 0 0 1 0 0 1 0 1 1 0 1 0 1 1 1 1 1 1 1]
R^2 Score: 0.7
Class Accuracy: 0.7
Mean Squared Error: 0.3
[1 0 0 1 0 0 1 0 1 1 0 1 0 1 1 1 1 1 1 1]
[1 0 0 1 1 1 1 1 1 0 0 1 1 1 1 1 0 1 1 1]
accuracy_score function – Accuracy: 0.7

This code snippet trains a logistic regression model, predicts the target values for the test set, and then calculates and prints the R² score and Mean Squared Error (MSE) for the model. Feel free to adapt it to your specific dataset and model!
Scikit-learn's model.score(X, y) works on the coefficient of determination, i.e. R², for regressors (for a classifier like LogisticRegression it returns the mean accuracy). It is a simple function called as model.score(X_test, y_test). It doesn't require the y_predicted value to be supplied externally to calculate the score for you; rather it calculates y_predicted internally and uses it in the calculations.

This is how it is done:

u = ((y_test - y_predicted) ** 2).sum()

v = ((y_test - y_test.mean()) ** 2).sum()

score = 1 - (u/v)

and you get the score ! 

[1 0 0 1 0 0 1 0 1 1 0 1 0 1 1 1 1 1 1 1] real
[1 0 0 1 1 1 1 1 1 0 0 1 1 1 1 1 0 1 1 1] predict
accuracy_score function – Accuracy: 0.7

We have 20 samples to compare (test_size=0.2 of 100); we got 14 right predictions, which means 70% of the 20 samples (14/20 = 0.7), so the score is 0.7, aka 70%!
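
A quick plain-Python check of that arithmetic, using the two vectors printed above:

y_true = [1,0,0,1,0,0,1,0,1,1,0,1,0,1,1,1,1,1,1,1]
y_pred = [1,0,0,1,1,1,1,1,1,0,0,1,1,1,1,1,0,1,1,1]
hits = sum(t == p for t, p in zip(y_true, y_pred))
print(hits, hits / len(y_true))   # 14 correct out of 20 -> 0.7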

https://sourceforge.net/projects/maxbox5/files/examples/1380_statistical_data_science1_pas_py_uc.txt/download

6. Monte Carlo Simulation     

Monte Carlo simulations use repeated random sampling to estimate the probability of different outcomes under uncertainty.

This interactive simulation estimates the value of the fundamental constant, pi (π), by drawing lots of random points to estimate the relative areas of a square and an inscribed circle.

execstr('np.random.seed(42)                                     '+LF+
        'n_samples = 1_000_000                                  '+LF+
        'xs = np.random.rand(n_samples)                          '+LF+
        'ys = np.random.rand(n_samples)                          '+LF+
        '# Points within the unit circle                         '+LF+
        'inside_circle = (xs**2 + ys**2) <= 1.0                  '+LF+
        'pi_estimate = inside_circle.sum() * 4 / n_samples       ');  
execstr('print("Estimated π:", pi_estimate)');

Estimated π: 3.141864

7. Time Series Analysis (ARIMA)

ARIMA (AutoRegressive Integrated Moving Average) is a popular model for forecasting univariate time series data by capturing autocorrelation in the data.

In time series analysis used in statistics and econometrics, autoregressive integrated moving average (ARIMA) and seasonal ARIMA (SARIMA) models are generalizations of the autoregressive moving average (ARMA) model to non-stationary series and periodic variation, respectively. All these models are fitted to time series in order to better understand it and predict future values.

pip install statsmodels

C:\maxbox\maxbox4\maxbox5>py -0

 -V:3.12 *        Python 3.12 (64-bit)

 -V:3.11          Python 3.11 (64-bit)

 -V:3.11-32       Python 3.11 (32-bit)

 -V:3.10-32       Python 3.10 (32-bit)

 -V:3.8           Python 3.8 (64-bit)

execstr('from statsmodels.tsa.arima.model import ARIMA ');
     //# Synthetic time series data
execstr('np.random.seed(42);  data = np.random.normal(100, 5, 50)');
execstr('time_series = pd.Series(data)');
     //# Fit ARIMA model (p=1, d=1, q=1)
execstr('model = ARIMA(time_series, order=(1,1,1))');
execstr('model_fit = model.fit()');
     //# Forecast next 5 points
execstr('forecast = model_fit.forecast(steps=5)');
execstr('print("Forecast:", forecast.values)');

Forecast: [98.26367322 98.50344679 98.51156834 98.51184343 98.51185274]

From understanding Bayesian inference and clustering, through advanced concepts like Logistic Regression, Linear Regression and ARIMA, these 7 advanced statistical approaches form a comprehensive and useful toolkit for any data scientist.

Most of the ideas are based on: https://medium.com/@sarowar.saurav10/20-advanced-statistical-approaches-every-data-scientist-should-know-ccc70ae4df28

4 Nations Locs RENFE Ibertren–BB16007– OeBB Roco1042–NS1104
4 Nations Locs RENFE Ibertren–Roce BB16007– OeBB Roco 1042–Piko NS 1157

6 Nations Locs Ibertren-Jouef-HAG-Trix-Roco-Piko


5 Important Diagram Types

Here, I’ll show you how to analyze a runtime-created dataset and extract meaningful insights with 5 diagram types:

  1. Bar Chart
  2. Scatter Plot
  3. Histogram
  4. Box Plot
  5. Correlation Matrix

For this project, we’ll create a dataset, clean it, filter the data, and create meaningful visualizations with those 5 types.

Runtime-created Data

http://www.softwareschule.ch/examples/pydemo91.htm

https://sourceforge.net/projects/maxbox5/files/examples/1380_statistical_data_analysis2_pas_py_uc.txt/download

First, let’s import the necessary libraries and load our employee dataset:

# Import libraries
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns

# Create Employee Dataset
data = {
    'Employee_ID': range(1001, 1011),
    'Name': ['Alice','Bob','Charlie','David','Emma','Frank', 'Grace','Helen','Isaac','Julia'],
    'Age': [25, 28, 35, 40, 22, 30, 45, 50, 29, 38],
    'Department': ['HR','IT','IT','Finance','HR','Finance','IT', 'Marketing','HR','Finance'],
    'Salary': [50000, 70000, 85000, 92000, 48000, 78000, 110000, 65000, 52000, 88000],
    'Experience_Years': [2, 4, 10, 15, 1, 8, 20, 12, 3, 11],
    'Performance_Score': [3.2, 4.5, 4.8, 3.7, 2.9, 4.2, 4.9, 3.8, 3.5, 4.1]
}

# Convert to DataFrame
df = pd.DataFrame(data)

# Display first few rows
print(df.head())

We can transpile this in maXbox with Python for Delphi:

//# Create Employee Dataset
execstr('data = { '+LF+
        '"Employee_ID": range(1001, 1011), '+LF+
        '"Name":["Alice","Bob","Charlie","David","Emma","Max","Grace","Helen","Isaac","Julia"],'+LF+
        '"Age":[25, 28, 35, 40, 22, 30, 45, 50, 29, 38], '+LF+
        '"Department":["HR","IT","IT","Finance","HR","Finance","IT","Marketing","HR","Finance"],'+LF+
        '"Salary":[50000,70000,85000,92000,48000,78000,110000,65000,52000,88000],'+LF+
        '"Experience_Years":[2, 4, 10, 15, 1, 8, 20, 12, 3, 11],'+LF+
        '"Performance_Score":[3.2,4.5,4.8,3.7,2.9,4.2,4.9,3.8,3.5,4.1]'+LF+
        '} ');

//# Convert to DataFrame
execstr('df = pd.DataFrame(data)');

//# Display first few rows
execstr('print(df.head())');
      
//Data Cleaning
//# Check for missing values and data types
execstr('print(df.isnull().sum())   # check for missing values'+LF+
        'print(df.dtypes)           # check data types');

//# Convert categorical columns to category type
execstr('df[''Department''] = df[''Department''].astype(''category'')');

//# Add an Experience Level column
execstr('df[''Experience_Level''] = pd.cut(df[''Experience_Years''],'+LF+
               'bins=[0,5,10,20], labels=[''Junior'',''Mid'',''Senior''])');

//# Show the updated DataFrame
execstr('print(df.head())'); 
     
//Find Employees with High Salaries
execstr('high_salary_df = df[df[''Salary''] > 80000]');
execstr('print(high_salary_df)');
//Find Average Salary by Department
execstr('print(df.groupby(''Department'')[''Salary''].mean())');
//Find the Highest Performing Department
execstr('print(f"Highest Performing Department: {df.groupby("Department")["Performance_Score"].mean().idxmax()}")');   
     

Now, we create meaningful visualizations using Matplotlib & Seaborn modules:

//Step 4: Data Visualization
//📊 1. Bar Chart — Average Salary by Department
     execstr('import matplotlib.pyplot as plt');
     execstr('import seaborn as sns');
     execstr('plt.figure(figsize=(8,5))'+LF+
      'sns.barplot(x=df[''Department''],y=df[''Salary''],estimator=np.mean,palette="coolwarm")'+LF+
      'plt.title(''Average Salary by Department'', fontsize=14)                               '+LF+
      'plt.xlabel(''Department'', fontsize=12)                                                '+LF+
      'plt.ylabel(''Average Salary'', fontsize=12)                                            '+LF+
      'plt.xticks(rotation=45)                                                                '+LF+
      'plt.show()                                                                             '); 
      
//📈 2. Scatter Plot — Salary vs Experience   
    execstr('plt.figure(figsize=(9,5))'+LF+
      'sns.scatterplot(x=df["Experience_Years"],y=df["Salary"],hue=df["Department"],palette="Dark2",s=100)'+LF+
      'plt.title(''Salary vs Experience'', fontsize=14)                               '+LF+
      'plt.xlabel(''Years of Experience'', fontsize=12)                               '+LF+
      'plt.ylabel(''Salary'', fontsize=12)                                            '+LF+
      'plt.legend(title="Department",bbox_to_anchor=(1, 1),fontsize=8)               '+LF+
      'plt.show()                                                                 '); 
      
//📊 3. Histogram — Salary Distribution
    execstr('plt.figure(figsize=(8,5))                                            '+LF+
      'plt.hist(df["Salary"], bins=5, color="blue", alpha=0.7, edgecolor="black")  '+LF+ 
      'plt.title("Salary Distribution", fontsize=14)                               '+LF+ 
      'plt.xlabel("Salary", fontsize=12)                                           '+LF+ 
      'plt.ylabel("Frequency", fontsize=12)                                        '+LF+ 
      'plt.show()                                                                  ');   
      
//📊 4. Box Plot — Salary by Department
    execstr('plt.figure(figsize=(8,5))                                    '+LF+
      'sns.boxplot(x=df["Department"], y=df["Salary"], palette="pastel")  '+LF+
      'plt.title("Salary Distribution by Department", fontsize=14)        '+LF+
      'plt.xlabel("Department", fontsize=12)                              '+LF+
      'plt.ylabel("Salary", fontsize=12)                                  '+LF+
      'plt.xticks(rotation=45)                                            '+LF+
      'plt.show()                                                         ');  
    

And the graphical result will be:

To go further, try working with larger datasets, dive into more advanced Pandas functions, or explore machine learning with Scikit-learn like above with statistical methods.

Correlation Matrix

Checking for correlation, and quantifying correlation, is one of the key steps during exploratory data analysis and forming hypotheses.

//Start with Tutor 140
//# Target column is under ch.target, the rest is under ch.data
execstr('from sklearn.datasets import fetch_california_housing');
execstr('ch = fetch_california_housing(as_frame=True)');
execstr('df = pd.DataFrame(data=ch.data, columns=ch.feature_names)');
execstr('df["MedHouseVal"] = ch.target; print(df.head())');
         
execstr('fig, ax = plt.subplots(figsize=(10, 6))             '+LF+
        'plt.title("corr. matrix - california_housing mX5")  '+LF+   
        'sns.heatmap(df.corr(),ax=ax,annot=True); plt.show() '); 

Several types of visualizations are commonly used in EDA using Python, including:

  • Bar charts: Used to show comparisons between different categories.
  • Line charts: Used to show trends over time or across different categories.
  • Pie charts: Used to show proportions or percentages of different categories (a pie and line chart sketch follows after this list).
  • Histograms: Used to show the distribution of a single variable.
  • Heatmaps: Used to show the correlation between different variables.
  • Scatter plots: Used to show the relationship between two continuous variables.
  • Box plots: Used to show the distribution of a variable and identify outliers.
  • Correlation matrix: Used to show how each variable relates to the others.
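Two of the listed chart types that were not shown above can be sketched on the same employee data; this assumes the df from the "Create Employee Dataset" step is still in scope (the correlation example above reassigns df to the California housing data):

import matplotlib.pyplot as plt

# Pie chart: proportion of employees per department
df['Department'].value_counts().plot.pie(autopct='%1.0f%%')
plt.title('Employees per Department')
plt.show()

# Line chart: salary plotted against years of experience
df.sort_values('Experience_Years').plot(x='Experience_Years', y='Salary', marker='o')
plt.title('Salary by Experience')
plt.show()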
Chasseral 1606m
3 Nation Locs
4 Nation RAMs – DB-FS-SBB-SNCF TEE
4 Swiss Machines: RAm 501,RAe 1051,Re4/4II11252,Re4/410033

Import Delphi Double Arrays to Python Numpy Arrays and show Correlation Matrix

First we have to create data features:

type DMatrix = array of array of double;

procedure TForm1DataCreate(Sender: TObject);
var i,j,tz: integer;
  //mData: dMatrix; //array of array of Double;
begin
  // Example reference data as 4 features with 7 samples
  SetMatrixLength(mData, 4, 7);
  SetMatrixLength(corrMatrix, 4, 4);
  mData[0]:= [1.0, 2.0, 3.0, 4.0, 5.0,6.0,7.0];
  mData[1]:= [22.5, 32.0, 42.0, 52.0,55.7,50.1,55.5];
  mData[2]:= [15.0, 16.0, 17.0, 19.0,28.9,30.0,32.4];
  mData[3]:= [25.0, 126.0, 127.0, 119.0,118.9,120.8,12.7];
  writeln('Test Matrix Data TM: '+flots(mdata[2][3]));
  CalculateCorrelationMatrix2(mdata, corrMatrix);

A heatmap in seaborn requires 2D input; use data = np.asarray([b]) in that case. Then we convert the Delphi arrays into a numpy array and reshape it for a pandas dataframe:

//4. Matplotlib & Seaborn Correlation Matrix 
  execstr('import matplotlib.pyplot as plt; import seaborn as sns');
     
  it:= 0;
  execstr('arr2 = np.empty(28, dtype = float)');
  for x:= 0 to 6 do 
     for y:= 0 to 3 do begin 
       execstr('arr2['+itoa(it)+']= '+flots(mdata[y][x]));
       inc(it)
     end;  
  execstr('data2 = np.asarray(arr2).reshape(7,4)'+LF+
          'df = pd.DataFrame(data2)');
  execstr('fig, ax = plt.subplots(figsize=(8, 6))'+LF+
          'plt.title("Delphi Matrix 2 PySeaborn Plot")'+LF+
          'sns.heatmap(df.corr(), ax=ax,annot=True); plt.show()');
          //}

As the data in seaborn now has the right shape (7,4), we compute the correlation matrix:
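For comparison, the same 4×4 Pearson correlation matrix can be sketched in plain NumPy from the reference data above; CalculateCorrelationMatrix2 on the Delphi side is assumed to produce the same coefficients:

import numpy as np

# the 4 features with 7 samples each, taken from the Delphi mData array above
mData = np.array([
    [ 1.0,   2.0,   3.0,   4.0,   5.0,   6.0,   7.0],
    [22.5,  32.0,  42.0,  52.0,  55.7,  50.1,  55.5],
    [15.0,  16.0,  17.0,  19.0,  28.9,  30.0,  32.4],
    [25.0, 126.0, 127.0, 119.0, 118.9, 120.8,  12.7],
])

corr = np.corrcoef(mData)    # rows are treated as variables (features)
print(corr.shape)            # (4, 4)
print(np.round(corr, 3))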

Data Science Demo

Symbolic Regression with Genetic Programming

Symbolic regression uses genetic programming, which evolves candidate models over generations through mutation and crossover (similar to natural selection).

# !pip install gplearn

import numpy as np
import matplotlib.pyplot as plt
from gplearn.genetic import SymbolicRegressor

# Generate Sample Data
X = np.linspace(-10, 10, 100).reshape(-1, 1)
y = 3*np.sin(X).ravel() + 2*X.ravel()**2 - 4

# Initialize the Symbolic Regressor
sr = SymbolicRegressor(population_size=2000, 
                       generations=20, 
                       stopping_criteria=0.01, 
                       function_set=('add','sub','mul','div', 'sin','cos','sqrt','log'), 
                       p_crossover=0.7, 
                       random_state=42)

# Fit the model
sr.fit(X, y)

# Make Predictions
y_pred = sr.predict(X)
# plot
plt.scatter(X, y, color='black', label='True Data')
plt.plot(X, y_pred, color='red', label='Discovered Function')
plt.legend()
plt.show()
C:\maxbox\ipso\ICT2023\ict_mod231>py -3.12 -m pip install gplearn

Google Gemini API

What is the Gemini API?

The Gemini API is a set of tools and interfaces that allow developers to integrate Google’s Gemini family of large language models (LLMs) into their applications and services. It’s essentially your access point to leverage the power of these powerful AI models.

First to get your API Key

Here’s a breakdown of what that means:

Key Concepts:

  • Gemini Family of LLMs: Gemini refers to Google’s suite of multimodal AI models, designed to understand and generate text, code, images, audio, and video. They come in various sizes (e.g., Gemini Ultra, Gemini Pro, Gemini Nano) optimized for different use cases and performance requirements.
  • API (Application Programming Interface): An API is like a messenger that allows different software applications to communicate with each other. The Gemini API provides a standardized way for your application to send requests to the Gemini models and receive responses.
{$I .\GAPIKEY.INC}
 
function TRestClient3_AskChatGPTGemini(askstream: string; 
                                   aResponseHeader:TRestResponseHandler):string;
var JPostdat: string;
    jo: TJSON; rest: TRestResource;
begin
 (* JPostDat:= '{'+
    '"model": "text-davinci-003",'+
    '"prompt": "%s",'+
    '"max_tokens": 2048,'+
    '"temperature": 0.15}';   *)
   JPostDat:= '{'+
    '"contents": [{'+
    '"parts": [{"text": "%s"}]'+
    '}]}';  
 
  with TRestClient.create(self) do begin
      rest:= Resource('https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash:generateContent?key='
                                               +GEMINI_APIKEY);
      println('@addr:'+objtostr(rest))
      rest.ContentType('application/json');
      //rest.Authorization('Bearer '+CHATGPT_APIKEY2);               
      ConnectionType:= hctWinInet;
      OnResponse:= @TRestOnResponseEvent2;
    try
      jo:= TJSON.Create(); 
      jo.parse(rest.Post(TStringStream.createstring(format(JPostDat,[askstream])))); 
      //result:= jo.values['candidates'].asarray[0].asobject['parts'].asobject['text'].asstring;
      result:= jo.values['candidates'].asarray[0].asobject['content'].asobject['parts'].asarray[0].asobject['text'].asstring;
      //writeln(formatjson(jo.stringify));
    finally
      Free;
      jo.Free;
    except 
      writeln('EWI_Exc:'+ExceptiontoString(exceptiontype,exceptionparam));
    end; 
  end; //with   
end; 
 
Call:
writeln('resreal: '+TRestClient3_AskChatGPTGemini('Why is the earth blue?', Nil));
  • Integration: The Gemini API allows you to seamlessly incorporate the capabilities of these AI models into your applications. This means you can use them to build features that understand language, generate content, translate text, analyze data, and much more.

JSON Structure for Gemini API

The Gemini API uses the following basic structure when specifying generation parameters like temperature:

{
  "contents": [
    {
      "parts": [
        {
          "text": "Your text prompt goes here."
        }
      ]
    }
  ],
  "generation_config": {
      "temperature": 0.7,
      "max_output_tokens": 100
    }
}
JPostDat:= '{'+
    '"contents": [{'+
    '"parts": [{"text": "%s"}]'+
    '            }],               '+
    '"generation_config": {        '+
    '  "temperature": 0.7,         '+
    '  "max_output_tokens": 100    '+
    '   }                          '+
    '}';  

What You Can Do with the Gemini API:

The Gemini API opens up a wide range of possibilities, including:

  • Natural Language Processing (NLP):
    • Text generation: Generate creative content like poems, stories, articles, code, etc.
    • Text summarization: Condense large blocks of text into key points.
    • Text translation: Translate text between various languages.
    • Sentiment analysis: Determine the emotional tone of text.
    • Question answering: Provide answers to questions based on provided text.
    • Chatbots and conversational AI: Build interactive and engaging chatbot experiences.
  • Multimodal Understanding:
    • Image understanding: Extract information and context from images.
    • Video analysis: Analyze and understand video content.
    • Audio transcription and analysis: Convert speech to text and extract insights.
    • Combined text and image reasoning: Understand relationships between text and visual content.
  • Code Generation:
    • Code completion: Suggest code snippets and complete lines of code.
    • Code explanation: Provide explanations of existing code.
    • Code translation: Convert code between different programming languages.
  • Customization: Fine-tune the models for specific tasks and industries.
  • And much more: The capabilities are constantly evolving, making the possibilities virtually limitless.

Key Features of the Gemini API:

  • Scalability and Reliability: Built on Google’s robust infrastructure to handle high volumes of requests.
  • Ease of Use: Provides well-documented APIs with various client libraries (e.g., Python, Java, Go) to simplify integration.
  • Security: Implements security measures to protect your data and the models.
  • Flexibility: Supports various input and output formats, allowing for diverse application integration.
  • Regular Updates: Models are continuously improved with the latest advancements in AI research.

How to Access the Gemini API:

  1. Google Cloud Platform (GCP): The Gemini API is primarily accessed through Google Cloud Platform (GCP). You’ll need a GCP project and enable the API.
  2. API Keys and Authentication: You’ll use API keys or other authentication mechanisms to authorize your application to use the API.
  3. Client Libraries: Use the provided client libraries for your programming language of choice to interact with the API.
  4. Billing: API usage is typically based on the number of requests and model compute resources consumed.


Get started with the Gemini API in Python

The Google AI Python SDK is the easiest way for Python developers to build with the Gemini API. The Gemini API gives you access to Gemini models created by Google DeepMind. Gemini models are built from the ground up to be multimodal, so you can reason seamlessly across text, images, and code.

  1. Go to Google AI Studio.
  2. Login with your Google account.
  3. Create an API key.
  4. Try a Python SDK quickstart in the Gemini API Cookbook.
  5. For detailed instructions, try the Python SDK tutorial on ai.google.dev.
#Get the Python Package
pip install -U google-generativeai

#Import the SDK and configure your API key.
import google.generativeai as genai
import os

genai.configure(api_key=os.environ["GEMINI_API_KEY"])

#Create a model and run a prompt.

#model = genai.GenerativeModel('gemini-pro')
model = genai.GenerativeModel('gemini-1.5-flash')
response = model.generate_content("Why is the earth blue?")
print(response.text)

Answer: The Earth is blue because of the oceans. Water absorbs longer wavelengths of light (like red and yellow) and reflects shorter wavelengths (like blue). Since the Earth’s surface is about 71% water, the blue light reflected from the oceans dominates the color we see from space.

const DEF_llmfunc = 
   'def llm_function(query):                                          '+LF+
   '  response = model.generate_content(query) # Generate response    '+LF+
   //'  st.markdown(response.text) # Print it out using streamlit       ';  
   '  print(response.text) # Print it out using NOT streamlit       ';  

procedure Gemini_PySolution(loc: string);
begin
 with TPythonEngine.Create(Nil) do begin
   //pythonhome:= PYHOME64;
   loaddll;
   autofinalize:= false;
   try
     Execstring('from math import sqrt');
     Execstring('import streamlit as st');
     Execstring('from bs4 import BeautifulSoup');
     Execstring('import os, requests');
     Execstring('import google.generativeai as genai');
    //  Execstring('from PIL import Image; import os, math, collections');
     
     Execstr('import io, sys');      
     //Execstring('import plotly.express as px');
     Execstr(PY_count); 
     Execstr(DEF_llmfunc);
     Execstr(DEF_GEMiniInput);
     execstr('output = io.StringIO()') 
     execstr('sys.stdout = output');  
     
     execstr('st.title("Proposal Calls") # Title for the page');
     
     execstr('os.environ[''GOOGLE_API_KEY'']='+loadfile3(exepath+'geminiapikey.txt')+'');
     //execstr('os.environ[''GOOGLE_API_KEY'']="'+loadfile3(exepath+'geminiapikey2.txt')+'"');
     execstr('genai.configure(api_key = os.environ[''GOOGLE_API_KEY''])');
     //execstr('model = genai.GenerativeModel(''gemini-pro'')');
     
     execstr('model = genai.GenerativeModel(''gemini-1.5-flash'')');
     execstr('response = model.generate_content("Why is the earth blue?")');
     execstr('print(response.text) ');
     //execstr('(read_http_input()) ');

     println(stringreplace(evalstr('output.getvalue().strip()'),
                            #10,#13#10,[rfReplaceAll, rfIgnoreCase])); 
   except
      raiseError;
   finally
      UnloadDll;
      Free;
   end;
 end; 
end; 

In Summary:

The Gemini API is the bridge that empowers developers to integrate the transformative capabilities of Google’s Gemini AI models into their applications. It opens up a world of possibilities for building innovative solutions, enhancing existing services, and creating entirely new user experiences. By utilizing the power of these advanced language models, developers can revolutionize how users interact with technology.

Interact with Python or Delphi

That's what Google AI Studio says:

Okay, let’s talk about using maXbox5 in conjunction with the Gemini API. While maXbox5 is a powerful environment for scripting and automation, there are some important limitations and considerations when it comes to directly programming with the Gemini API within maXbox5 itself:

Challenges:

  1. Direct API Access:
    • maXbox5 is primarily designed for its own scripting language (based on Pascal) and system-level interactions. It doesn’t inherently have built-in support for making direct HTTP requests or handling complex JSON responses required by web APIs like the Gemini API.
  2. Authentication:
    • Accessing the Gemini API requires authentication with API keys or OAuth tokens. Managing these securely and implementing the complex authentication protocols within maXbox5 would be quite challenging.
  3. Complex Data Structures:
    • The Gemini API returns data in complex JSON formats, which maXbox5’s scripting language might not readily parse. You’d need to write custom parsing logic, which could become quite cumbersome.
  4. External Libraries and Dependencies:
    • The most practical way to interact with the Gemini API is using a programming language with a suitable client library (e.g., Python’s google-generativeai library). maXbox5’s scripting environment doesn’t allow direct inclusion of these external libraries.
  5. Performance:
    • If you tried to handle all API requests and response processing directly within maXbox5’s scripting, performance might become a concern, especially for larger or more complex interactions.

How You Could Potentially Integrate (with Limitations):

Given the limitations, directly calling the Gemini API from within maXbox5’s scripts isn’t the most practical approach. Here’s how you could theoretically integrate them, but with significant limitations:

  1. Using External Executables or Scripts:
    • Python Script: You could write a separate Python script that uses the google-generativeai library to interact with the Gemini API. The script would accept input arguments and return results.
    • maXbox5 Calls: Your maXbox5 script could then use the ExecuteProcess function (or similar) to call this external Python script, passing data via command-line arguments.
    • Parsing Results: You’d need to parse the output from the Python script (likely in JSON format) back into a format that maXbox5 can use. This would involve string manipulation and potentially some custom parsing functions.
    // maXbox5 script example (VERY SIMPLIFIED):
    var
      PythonScript: string;
      Prompt: string;
      Output: string;
      Cmd: string;
    begin
      PythonScript := 'C:\path\to\your\gemini_script.py';
      Prompt := 'Write a short poem about flowers';
      Cmd := Format('%s "%s"', [PythonScript, Prompt]);
      Output := ExecuteProcess(Cmd, nil);
      if Output <> '' then begin
        writeln('Output from Python: ', Output);
        // parse the output for needed values
      end;
    end;
  2. Using HTTP GET/POST (Very Complicated):
    • You could potentially use the HTTP functions (if available in maXbox5) to try and construct direct HTTP requests to the Gemini API.
 JPostDat:= '{'+
    '"contents": [{'+
    '"parts": [{"text": "%s"}]'+
    '}]}';  
 
  with TRestClient.create(self) do begin
      rest:= Resource('https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash:generateContent?key='
                                               +GEMINI_APIKEY);
      println('@addr:'+objtostr(rest))
      rest.ContentType('application/json');
      //rest.Authorization('Bearer '+CHATGPT_APIKEY2);               
      ConnectionType:= hctWinInet;
      OnResponse:= @TRestOnResponseEvent2;
    try
      jo:= TJSON.Create(); 
      jo.parse(rest.Post(TStringStream.createstring(format(JPostDat,[askstream])))); 
    However, this would involve complex manual construction of HTTP headers, JSON data, handling authentication tokens, and processing the responses. This would be incredibly complex and error-prone. I wouldn’t recommend it.

Why It’s Not Recommended:

  • Complexity: The integration approach is quite complex. It would involve a considerable amount of custom coding and is far from being a natural fit for maXbox5’s intended use.
  • Maintenance: This setup becomes brittle and difficult to maintain. If the Gemini API changes, you would likely need to update both the Python script and your maXbox5 script.
  • Performance: The overhead of executing external processes, parsing data, etc., would negatively impact the performance of your automation.
  • Security Risks: Hardcoding credentials within scripts, even in external ones, poses security risks.

Best Approach:

If your goal is to use the Gemini API for scripting and automation tasks, it’s highly recommended to use a more suitable scripting language like Python, Delphi or JavaScript, which offer excellent libraries for interacting with web APIs like the Gemini API and much better JSON handling capabilities.

Conclusion:

While you could technically try to integrate maXbox5 with the Gemini API through external scripts or direct HTTP requests, it’s not a practical or efficient approach. MaXbox5 is best used for its core strengths in system automation and scripting with its own internal environment. For interacting with complex APIs like the Gemini API, other languages and ecosystems are better suited.

If you can share what kind of task you’re trying to accomplish, I can suggest better tools and approaches.


JSON Testtemplate

const JSONGEMINI_TEST =

 ' {                                                     '+LF+
 '   "candidates": [                                     '+LF+
 '       {                                               '+LF+
 '           "content": {                                '+LF+
 '               "parts": [                              '+LF+
 '                   {                                   '+LF+
 '                      "text": "The Earth is blue because of the oceans.  Water absorbs longer wavelengths of light,         '+LF+
 '                      like red and yellow, and reflects shorter wavelengths, like blue.  Since the Earth''s surface is       '+LF+
 '                       about 70% water, the blue light reflected from the oceans dominates the light we see from space,     '+LF+
 '                       making the planet appear blue."                                                                    '+LF+
 '                   }                                   '+LF+ 
 '               ],                                      '+LF+ 
 '               "role": "model"                         '+LF+ 
 '           },                                          '+LF+ 
 '           "finishReason": "STOP",                     '+LF+ 
 '           "avgLogprobs": -0.10744373122258,           '+LF+ 
 '           "parts": null                               '+LF+ 
 '       }                                               '+LF+ 
 '   ],                                                  '+LF+ 
 '   "usageMetadata": {                                  '+LF+ 
 '       "promptTokenCount": 6,                          '+LF+ 
 '       "candidatesTokenCount": 67,                     '+LF+ 
 '       "totalTokenCount": 73                           '+LF+ 
 '   },                                                  '+LF+ 
 '   "modelVersion": "gemini-1.5-flash"                  '+LF+ 
'}                                                       ';

getGemini AI__________________: 
@addr:TRestResource@AA15A2B0
@addr:TRestClient@B0861030 response cont: {
    "candidates": [
        {
            "content": {
                "parts": [
                    {
                        "text": "The Italian word \"conclusione\" translates to **Schlussfolgerung** or **Fazit** in German.  Both words convey the idea of a conclusion, but there's a subtle difference:\n\n* **Schlussfolgerung** emphasizes the process of drawing a conclusion, the reasoning leading to it.  It's more formal.\n\n* **Fazit** is a more concise summary of the main points, the final statement. It's often used in a less formal context.\n\nSo, depending on the context of the Italian \"conclusione,\" either \"Schlussfolgerung\" or \"Fazit\" could be the best translation.\n"
                    }
                ],
                "role": "model"
            },
            "finishReason": "STOP",
            "avgLogprobs": -0.10391504961745184
        }
    ],
    "usageMetadata": {
        "promptTokenCount": 11,
        "candidatesTokenCount": 133,
        "totalTokenCount": 144
    },
    "modelVersion": "gemini-1.5-flash"
} response code: 200 enabled compression True content-encoding: verifycert: True

And the access from different JSON Libs:

jsn:= TMcJsonItem.Create(); 
jsn.AsJSON:=formatjson(JSONGEMINI_TEST);
writ(jsn.values['candidates'].items[0]['content'].items[0]['text'].items[0].asstring);
    jsno:= jsn.values['candidates'].items[0]['content'].items[0]['text'].asobject;
    writ(StringReplace(jsno['text'].asstring,'\n',CRLF,[rfReplaceAll]));


try
   jo:= TJSON.Create(); 
   jo.parse(rest.Post(TStringStream.createstring(format(JPostDat,[askstream])))); 
   result:= jo.values['candidates'].asarray[0].asobject['content'].asobject['parts'].asarray[0].asobject['text'].asstring;
     //writeln(formatjson(jo.stringify));
   finally
     Free;
     jo.Free;
   except  
      writeln('EWI_Exc:'+ExceptiontoString(exceptiontype,exceptionparam));
   end; 

You see candidates and parts are arrays of objects!

Jouef CC 6526 & Lemaco CC 40110 Nice
Roland Jupiter-X
AI Regression Test

MIDI Magic

The General MIDI specification defines 128 instruments, and 47 percussion sounds. All channels except channel 9 (counting from zero) play instruments, channel 9 plays percussion sounds, with different note numbers resulting in different sounds. The MIDI standard itself does not define any instruments or percussion sounds.

Other specifications (General MIDI 2, GS, XG etc.) define more sounds, and have mechanisms to select which channel(s) to use for percussion sounds.

So first we set our Midi Controller or Device:

var note: TMidinote;
    tmidi: TJclMIDIOut;
    fMidiOut: IJclMidiOut;
    fChannel: TMidiChannel;

mlist:= THashedStringList.create;
   GetMidiOutputs(mlist);  //check for midi devices
   writeln(mlist.text)
   mlist.free;
fmidiout:= MIDIOut(0);
//fmidiout.SendMessage(const Data: array of Byte);
fmidiout.SwitchPolyModeOn(16);
writ(fmidiout.getname);
fmidiout.SendNoteOn(2, note+36, $7f); //test tone

So we set an interface from IJclMidiOut and we get an instance of JclWinMIDI.MIDIOut(DeviceID);

type
  TJclWinMidiOut = class(TJclMidiOut, IJclWinMidiOut)
function MidiOut(DeviceID: Cardinal): IJclWinMidiOut;
procedure GetMidiOutputs(const List: TStrings);
procedure MidiOutCheck(Code: MMResult);

We can see our device as Microsoft GS Wavetable Synth

Next we play a tune:

playTune(['g4','g#4','a#4','c5','d5','d#5','f5','g5'], 500, 4, true, fmidiout);
//https://www.hooktheory.com/cheat-sheet/key/g/phrygian
maxform1.Showmidiform(self);

This tune is from G Phrygian, and the key of G Phrygian has a key signature of 3 flats (B♭, E♭, and A♭). In our notation we use sharps, so A♭ is written as g#.

PlayTune is a procedure which calls the midi device and controller in the following way:

procedure PlayTune(tune: array of string;pause: integer;octave: byte;fin: boolean;fmidi: IJclMidiOut);
var i, anote: integer;
begin
  for i:= 0 to High(tune) do begin
     anote:= StrtoMIDINote2(tune[i],octave);
     fmidi.SendNoteOn(2, anote, $7f); 
     delay(pause)  
     fmidi.SendNoteOff(2, anote, $7f); 
  end;
  if fin then sleep(1500);  
end;

We send note-on and note-off messages in a sequence and pass the device as fmidi. Noteon (i-rate note on) and noteoff (i-rate note off) are the simplest MIDI OUT opcodes: noteon sends a MIDI noteon message to the MIDI OUT port, and noteoff sends a noteoff message. A noteon must always be followed by a noteoff with the same channel and note number inside the same instrument, otherwise the note will play endlessly. We can also change the instrument by passing a midi channel and the instrument id:

MIDIMsgProgramChange = $C0;
fmidiout.SendProgramChange(2, 7);  //channel 2, instrument 7

To get a midi note we call the function StrtoMIDINote2(tune[i],octave);
function StrtoMIDINote2(Notestr: string; octave: byte): TMIDINote;

This conversion implementation assumes that the input string is in the format 'NoteOctave' (e.g., 'C4', 'F#3', 'B5', 'A4').
It handles both sharp notes (with '#') and natural notes, but not flats. If an invalid note is provided, the function returns -1.

It converts a string representation of a musical note to its corresponding MIDI note number. This function does the following:

  • It defines a constant array of note names.
    mnotes:= ['C','C#','D','D#','E','F','F#','G','G#','A','A#','B'];
  • It extracts the note name and octave from the input string.
    NoteName := AnsiUpperCase(Copy(NoteStr, 1, Length(NoteStr) - 1));
    Octave := StrToIntDef(Copy(NoteStr, Length(NoteStr), 1), 4);
  • It finds the index of the note in the Notes mnotes array by looping.
for I:= 0 to High(mNotes) do 
   if mNotes[I] = NoteName then begin
     NoteIndex:= I;
     write(mnotes[i]+' ')
     Break;
   end;

It calculates the MIDI note number using the formula: (Octave + 1) * 12 + NoteIndex.
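The same conversion logic can be sketched in a few lines of Python; the function name str_to_midi_note is just for illustration and is not part of the maXbox script:

# Hypothetical Python sketch of the NoteOctave -> MIDI number conversion described above
MNOTES = ['C','C#','D','D#','E','F','F#','G','G#','A','A#','B']

def str_to_midi_note(note_str: str, default_octave: int = 4) -> int:
    name = note_str[:-1].upper()                 # note name, e.g. 'C' or 'F#'
    try:
        octave = int(note_str[-1])               # trailing digit is the octave
    except ValueError:
        octave = default_octave
    if name not in MNOTES:                       # flats are not handled, as in the Pascal version
        return -1
    return (octave + 1) * 12 + MNOTES.index(name)

print(str_to_midi_note('C4'))    # 60 (middle C)
print(str_to_midi_note('F#3'))   # 54
print(str_to_midi_note('A4'))    # 69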

https://studio.moises.ai/library/

Chord Progression

We can also play a chord progression. A chord progression is a sequence of two or more chords played one after the other. This is the chord:

https://www.hooktheory.com/cheat-sheet/key/g/phrygian

   for it:= 1 to 3 do begin
     playChord(['g2','a#3','d4','g4'], 1100, false, fmidiout);   //gm
     playChord(['c3','c4','d#4','g4'], 1100, false, fmidiout);   //cm
     playChord(['a#2','a#3','d4','f4'], 1000, false, fmidiout);  //bflat
     playChord(['g#2','c4','d#4','g#4'], 900, true, fmidiout);   //aflat
   end; //}


http://www.softwareschule.ch/examples/1321_KCP_Phrygian_4.mid

A cool function is the MIDI Note Frequency:

const
  HalftonesPerOctave = 12;
  MiddleA            = 440.0; // Hertz
  MidiMiddleA        = 69;    // A4 = 440 Hertz

function MIDINote(Hertz: Extended): Extended;
begin
  if Hertz < 1.0 then
    result:= mininteger  //Low(Integer)
  else
    result:= LogBase2(Hertz/MiddleA)*HalftonesPerOctave+MidiMiddleA;
end;

See also: https://www.colincrawley.com/midi-note-to-audio-frequency-calculator/
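A quick Python sketch of the same relation (plus its inverse, added here only for illustration) confirms that A4 = 440 Hz maps to note 69 and note 60 maps to middle C at roughly 261.63 Hz:

import math

HALFTONES_PER_OCTAVE = 12
MIDDLE_A_HZ = 440.0
MIDI_MIDDLE_A = 69

def midi_note(hertz: float) -> float:
    # same formula as the Pascal MIDINote() function above
    return math.log2(hertz / MIDDLE_A_HZ) * HALFTONES_PER_OCTAVE + MIDI_MIDDLE_A

def note_to_hertz(note: float) -> float:
    # inverse relation, for illustration
    return MIDDLE_A_HZ * 2 ** ((note - MIDI_MIDDLE_A) / HALFTONES_PER_OCTAVE)

print(midi_note(440.0))              # 69.0
print(round(note_to_hertz(60), 2))   # ~261.63 Hz, middle C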

The Windows implementation of the IJclWinMidiOut interface (not to be confused with the MIDI hardware interface) also defines stereo channels (not to be confused with a midi channel) with volume controls for both sides:

type
  TStereoChannel = (scLeft, scRight);
  
  // MIDI Out Definition
  IJclWinMidiOut = interface(IJclMidiOut)
    ['{F3FCE71C-B924-462C-BA0D-8C2DC118DADB}']
    // property access methods
    function GetChannelVolume(Channel: TStereoChannel): Word;
    procedure SetChannelVolume(Channel: TStereoChannel; const Value: Word);
    function GetVolume: Word;
    procedure SetVolume(const Value: Word);
    // properties
    property ChannelVolume[Channel: TStereoChannel]: Word read GetChannelVolume write SetChannelVolume;
    property Volume: Word read GetVolume write SetVolume;
  end;

Source of the script: http://www.softwareschule.ch/examples/midi3.htm

Source of the song: http://www.softwareschule.ch/download/zeitraum.mp3

modes of major scale

Code the Circle of Fifth

In music theory, the circle of fifths (sometimes also cycle of fifths) is a way of organizing pitches as a sequence of perfect fifths. Starting on a C, and using the standard system of tuning for Western music (12-tone equal temperament).

So we always start with C major and add a note offset of 7 semitones (a fifth):

//G major: g - a - b ^ c - d - e - f# ^ g
playTune2(['c4','d4','e4','f4','g4','a4','b4','c5'],300,7,true,fmidiout);
//D major: d - e - f# ^ g - a - b - c# ^ d
playTune2(['c4','d4','e4','f4','g4','a4','b4','c5'],300,14,true,fmidiout);
//A major: a - b - c# ^ d - e - f# - g# ^ a
PlayTune2(['c4','d4','e4','f4','g4','a4','b4','c5'],300,21,true,fmidiout);
//E major: e - f# - g# ^ a - b - c# - d# ^ e
//https://de.wikipedia.org/wiki/E-Dur
playTune2(['c4','d4','e4','f4','g4','a4','b4','c5'],300,28,true,fmidiout);
https://de.wikipedia.org/wiki/C-Dur

And the result will be:

G4(67) A4(69) B4(71) C5(72) D5(74) E5(76) F#5(78) G5(79)

D5(74) E5(76) F#5(78) G5(79) A5(81) B5(83) C#6(85) D6(86)

A5(81) B5(83) C#6(85) D6(86) E6(88) F#6(90) G#6(92) A6(93)

E6(88) F#6(90) G#6(92) A6(93) B6(95) C#7(97) D#7(99) E7(100)

mX5🐞 executed: 18/11/2024 11:58:14 Runtime: 0:0:19.81 Memload: 69% use

Script: http://www.softwareschule.ch/examples/midi4.txt

The code is straightforward and based on conversions with StrtoMIDINote3(tune[i], offset) to set the offset and octave. This conversion again assumes that the input strings are in the format 'NoteOctave' (e.g., 'C4', 'F#3', 'B5', 'A4'); the function StrtoMIDINote3 is called from within the procedure PlayTune2():

function StrtoMIDINote3(Notestr: string; offset: byte): TMIDINote;
var i: integer;
    NoteName: string;
    Octave, NoteIndex: Integer;  note:TMidinote;
begin
  mnotes:=['C','C#','D','D#','E','F','F#','G','G#','A','A#','B']; //+
  NoteName:= AnsiUpperCase(Copy(NoteStr,1, Length(NoteStr)- 1));
  Octave:= StrToIntDef(Copy(NoteStr,Length(NoteStr), 1), 4);
  // Find note index
  NoteIndex:= -1;
  for I:= 0 to High(mNotes) do 
    if mNotes[I] = NoteName then begin
      NoteIndex:= I+offset;
      Break;
    end;
  // Calculate MIDI note number & display
  if NoteIndex <> -1 then begin
    Result:= (Octave+1)*12 + NoteIndex;
    write(midiNotetostr(result+12)+'('+itoa(result)+')'+' ');
  end else
    Result:= -1; // Invalid note 
end;

procedure PlayTune2(tune: array of string; pause,offset: integer; fin:boolean; fmidi: IJclMidiOut);
var i, anote: integer;
begin
  for i:= 0 to High(tune) do begin
     anote:= StrtoMIDINote3(tune[i], offset);
     fmidi.SendNoteOn(2, anote, $7f); 
     delay(pause)  
     fmidi.SendNoteOff(2, anote, $7f); 
  end;
  if fin then sleep(1500);  
  writeln(CRLF);
end;
https://de.wikipedia.org/wiki/C-Dur
Mode        Tonic relative to major scale   Interval sequence   Example
Ionian      I                               W–W–H–W–W–W–H       C–D–E–F–G–A–B–C
Dorian      ii                              W–H–W–W–W–H–W       D–E–F–G–A–B–C–D
Phrygian    iii                             H–W–W–W–H–W–W       E–F–G–A–B–C–D–E
Lydian      IV                              W–W–W–H–W–W–H       F–G–A–B–C–D–E–F
Mixolydian  V                               W–W–H–W–W–H–W       G–A–B–C–D–E–F–G
Aeolian     vi                              W–H–W–W–H–W–W       A–B–C–D–E–F–G–A
Locrian     viiø                            H–W–W–H–W–W–W       B–C–D–E–F–G–A–B

For the sake of simplicity, the examples shown above are formed by natural notes (also called “white notes”, as they can be played using the white keys of a piano keyboard). However, any transposition of each of these scales is a valid example of the corresponding mode.

https://flypaper.soundfly.com/write/scale-mode-cheat-sheet-how-to-remember-each-and-every-mode/
https://www.hooktheory.com/cheat-sheet/key/c/minor
5 Nation Locomotion
Pip Install
Short History of Industrial Design
BPM #120

4 Machine Learning Object Detection Solutions

Object Detection APIs and libraries provide fast and accurate image object recognition using advanced neural networks and models developed by machine learning experts. They also support object detection, video detection and object tracking using RetinaNet, YOLOv3 and TinyYOLOv3 models pre-trained on standard datasets.

Solution Script:
https://github.com/maxkleiner/HttpComponent/blob/main/1316_detector25_integrate4solutions.txt

We deliver 4 showcases with the same image to compare and evaluate:

As we can see, the algorithm, the data and the result are distributable and scalable:

       Algorithm (model)   Data (image)   Result (JSON)   Tech
   1.  local               local          local           python core
   2.  cloud               local          local/cloud     post API
   3.  cloud               cloud          local/cloud     get API
   4.  cloud               cloud          local           rest API

The first solution starts with the tiny-yolov3.pt model from ImageAI:

# using the pre-trained TinyYOLOv3 model
detector.setModelTypeAsTinyYOLOv3()
detector.setModelPath(model_path)
# loads the model path specified above using the setModelPath() class method
detector.loadModel()
custom = detector.CustomObjects(person=True, laptop=True, car=False, train=True, clock=True, chair=False, bottle=False, keyboard=True)
The reference image for the solutions

Result: Start with maXbox5 ImageAI Detector —>
train : 80.25 %
integrate image detector compute ends…

elapsedSeconds:= 4.879268800000 no console attached..
mX5🐞 executed: 29/07/2024 09:53:49 Runtime: 0:0:8.143 Memload: 75% use

Then we asked why the model can't see the persons. It depends on the frame: by cropping the image it sees the persons, but no train anymore!

input_path=r"C:\maxbox\maxbox51\examples\1316_elsass_20240728_161420crop.jpg"

Result: Start with maXbox5 ImageAI Detector —>
this first line fine
person : 99.29 %
person : 99.58 %
person : 98.74 %
integrate image detector compute ends…
elapsedSeconds:= 4.686975000000 – no console attached..
mX5🐞 executed: 29/07/2024 10:09:30 Runtime: 0:0:7.948 Memload: 77% use

You can see one false positive in the green bounding box above!

The second solution is an API: URL_APININ_DETECT = 'https://api.api-ninjas.com/v1/objectdetection/';

The Object Detection API provides fast and accurate image object recognition using advanced neural networks developed by machine learning models.

https://api-ninjas.com/api/objectdetection

const URL_APININ_DETECT= 'https://api.api-ninjas.com/v1/objectdetection/';

function TestHTTPClassComponentAPIDetection2(AURL, askstream, aApikey: string): string;
var HttpReq1: THttpRequestC;
    Body: TMultipartFormBody;
    Body2: TUrlEncodedFormBody;  //ct: TCountryCode;
begin
  Body:= TMultipartFormBody.Create;
  Body.ReleaseAfterSend:= True;
  //Body.Add('code','2','application/octet-stream');
  //Body.AddFromFile('image', exepath+'randimage01.jpg');
  Body.AddFromFile('image',
                           'C:\maxbox\maxbox51\examples\1316_elsass_20240728_resized.jpg');
  
  HttpReq1:= THttpRequestC.create(self);
  httpreq1.useragent:= USERAGENT3;
  httpReq1.headers.add('X-Api-Key:'+AAPIKEY);
  httpReq1.headers.add('Accept:application/json');
  HttpReq1.SecurityOptions:= [soSsl3, soPct, soIgnoreCertCNInvalid];
  try
    if HttpReq1.Post1Multipart(AURL, body) then 
       result:=HttpReq1.Response.ContentAsString
    else Writeln('APIError '+inttostr(HttpReq1.Response.StatusCode2));
  finally  
    writeln('Status3: '+gethttpcod(HttpReq1.Response.statuscode2))
    HttpReq1.Free;  
    sleep(200)
    // if assigned(body) then body.free;
  end; 
end;

This result comes from a POST of a multipart form body stream, and you need an API key; the result comes back as JSON. As you can see, we need a call to HttpReq1.Post1Multipart for uploading files:
POST data using the Content-Type multipart/form-data
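The same multipart upload can be sketched with the Python requests library; the file path is the one used above, and YOUR_API_KEY is a placeholder:

import requests

url = 'https://api.api-ninjas.com/v1/objectdetection/'
headers = {'X-Api-Key': 'YOUR_API_KEY', 'Accept': 'application/json'}

# post the image as a multipart/form-data field named 'image', as in the Pascal code above
with open(r'C:\maxbox\maxbox51\examples\1316_elsass_20240728_resized.jpg', 'rb') as f:
    resp = requests.post(url, headers=headers, files={'image': f})

print(resp.status_code)
print(resp.text)    # JSON list of labels, confidences and bounding boxes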

Result Status3: SC_OK
back [ {
"label": "train",
"confidence": "0.76",
"bounding_box": {
"x1": "-6",
"y1": "291",
"x2": "1173",
"y2": "1347"
}
},
{
"label": "person",
"confidence": "0.72",
"bounding_box": {
"x1": "535",
"y1": "854",
"x2": "815",
"y2": "1519"
}
},
{
"label": "person",
"confidence": "0.69",
"bounding_box": {
"x1": "823",
"y1": "790",
"x2": "1055",
"y2": "1350"
}
},
as JSON back

The third solution wants to get the text back from the image. The Image to Text API detects and extracts text from images using state-of-the-art optical character recognition (OCR) algorithms. It can detect texts of different sizes, fonts, and even handwriting on pictures or draws.

URL_APILAY_IMG2TEXT = 'https://api.apilayer.com/image_to_text/url?url=%s';

function Image_to_text_API2(AURL, url_imgpath, aApikey: string): string;
var httpq: THttpConnectionWinInet;
    rets: TStringStream;  
    heads: TStrings; iht: IHttpConnection; //losthost:THTTPConnectionLostEvent;
begin
  httpq:= THttpConnectionWinInet.Create(true); 
  rets:= TStringStream.create('');
  heads:= TStringlist.create;     
  try 
    heads.add('apikey='+aAPIkey);
    iht:= httpq.setHeaders(heads);
    httpq.Get(Format(AURL,[url_imgpath]), rets);
    if httpq.getresponsecode=200 Then result:= rets.datastring
      else result:='Failed:'+
             itoa(Httpq.getresponsecode)+Httpq.GetResponseHeader('message');
  except  
    writeln('EWI_HTTP: '+ExceptiontoString(exceptiontype,exceptionparam));
  finally
    httpq:= Nil;
    heads.Free;
    rets.Free;
  end;                  
end;                 //}

And the model is able to read the name of the Locomotive!:

Result_: {"lang":"en","all_text":"18130\nBERTHOLD","annotations":["18130","BERTHOLD"]}
mX5🐞 executed: 29/07/2024 11:04:12 Runtime: 0:0:3.527 Memload: 81% use

The fourth and last solution in this machine learning package is a Python one, available as Python for maXbox or Python4Delphi:

procedure PyCode(imgpath, apikey: string);
begin
  with TPythonEngine.Create(Nil) do begin
  //pythonhome:= 'C:\Users\User\AppData\Local\Programs\Python\Python312\';
  try
    loadDLL;
    autofinalize:= false;
    ExecString('import requests, sys');
    ExecStr('url= "https://api.apilayer.com/image_to_text/url?url='+imgpath+'"'); 
    ExecStr('payload = {}');  
    ExecStr('headers= {"apikey": "'+apikey+'"}'); 
    Println(EvalStr('requests.request("GET",url,headers=headers, data=payload).text')); 
    Println('Version: '+EvalStr('sys.version'));  
  except
    raiseError();        
  finally       
    free;
  end; 
 end;
end; 

{"lang": "en", "all_text": "18130\nBERTHOLD", "annotations": ["18130", "BERTHOLD"]}

Version: 3.12.4 (tags/v3.12.4:8e8a4ba, Jun 6 2024, 19:30:16) [MSC v.1940 64 bit (AMD64)]
mX5🐞 executed: 29/07/2024 11:18:13 Runtime: 0:0:4.60 Memload: 79% use

S.A.C.M. Elsässische Maschinenbauanstalt Graffenstaden C-Kuppler (2x)
– built in 1900

Conclusion and Summary

  1. Built with simplicity in mind, ImageAI supports a list of state-of-the-art Machine Learning algorithms for image prediction, custom image prediction, object detection, video detection, video object tracking and image predictions trainings. ImageAI currently supports image prediction and training using 4 different Machine Learning algorithms trained on the ImageNet-1000 dataset. ImageAI also supports object detection, video detection and object tracking using RetinaNet, YOLOv3 and TinyYOLOv3 trained on COCO dataset. Finally, ImageAI allows you to train custom models for performing detection and recognition of new objects.
    https://github.com/OlafenwaMoses/ImageAI
  2. Object Detection API – The Object Detection API provides fast and accurate image object recognition using advanced neural networks developed by machine learning experts. It also has a Live Demo or rules with Mime Post Multipart FormData_:
    https://api-ninjas.com/api/objectdetection
    https://github.com/maxkleiner/HttpComponent
    It recognizes and reads the text embedded in images very accurately and is quite usable.
    The Image to Text API uses a neural net (LSTM) based OCR engine which is focused on line recognition, but also supports recognizing character patterns. It supports both handwriting and printed materials.
    It will extract the text information easily, even when the text or number is positioned at an angle, like Berthold.
    https://apilayer.com/marketplace/image_to_text-api
  4. The Requests library in Python is one of the integral parts of Python for making HTTP requests to a specified URL as post or get. Whether it be REST APIs or Web Scraping, requests are a must to be learned for proceeding further with these technologies.
  5. Out of the examples above but also mention it: The Face Detect API uses state of the art computer vision algorithms to accurately and efficiently detect faces in images.
    https://api-ninjas.com/api/facedetect
The Face Detect API
Object Detect Def-End

{
"lang": "und",
"all_text": "DB\n218 105-5\n000\nDB\nV 200 035\n1000\nว",
"annotations": [
"DB","218","105-5","000",
"DB","V","200","035","1000",
"ว"]
}
mX5🐞 executed: 22/08/2024 17:45:23 Runtime: 0:0:3.414 Memload: 77% use

unreal AI
jumping jack max

Object Detection API

We call it AIM and this stands for Artificial Intelligence Machine.

The Object Detection API provides fast and accurate image object recognition using advanced neural networks developed by machine learning experts.

https://api-ninjas.com/api/objectdetection

https://github.com/maxkleiner/HttpComponent

After you have got your API key (the API key associated with your account), we need an HTTP component with classes that are able to post multipart form-data feeds or streams. An HTTP multipart request is an HTTP request that HTTP clients construct to send files and data over to an HTTP server. It is commonly used by browsers and HTTP clients to upload files to the server.

The content type “multipart/form-data” should be used for submitting forms that contain files, non-ASCII data, and binary data combined in a single body.

Live Demo of the API

So a multipart request is a request containing several packed requests inside its entity and we can script that:

const URL_APILAY_DETECT = 'https://api.api-ninjas.com/v1/objectdetection/';

function TestHTTPClassComponentAPIDetection2(AURL, askstream, aApikey: string): string;
var HttpReq1: THttpRequestC;
    Body: TMultipartFormBody;
    Body2: TUrlEncodedFormBody;
begin
  Body:= TMultipartFormBody.Create;
  Body.ReleaseAfterSend:= True;
  //Body.Add('code','2','application/octet-stream');
  Body.AddFromFile('image', exepath+'randimage01.jpg');
  HttpReq1:= THttpRequestC.create(self);
  HttpReq1.headers.add('X-Api-Key:'+AAPIKEY);
  HttpReq1.headers.add('Accept:application/json');
  try
    if HttpReq1.Post1Multipart(AURL, body) then 
       writeln(HttpReq1.Response.ContentAsString)
    else Writeln('APIError '+inttostr(HttpReq1.Response.StatusCode2));
  finally  
    writeln('Status3: '+gethttpcod(HttpReq1.Response.statuscode2))
    HttpReq1.Free;  
    sleep(200)
    // if assigned(body) then body.free;
  end; 
end;

print(TestHTTPClassComponentAPIDetection2(URL_APILAY_DETECT,' askstream',N_APIKEY));

The result returns a list of detected object labels, confidence percentages and bounding boxes. Objects with confidence less than 0.3 (30%) are filtered out.

🙂 True
[{“label”: “person“, “confidence”: “0.88”, “bounding_box”: {“x1”: “329”, “y1”: “278”, “x2”: “423”, “y2”: “420”}}, {“label”: “surfboard“, “confidence”: “0.46”, “bounding_box”: {“x1”: “316”, “y1”: “407”, “x2”: “402”, “y2”: “424”}}, {“label”: “skis“, “confidence”: “0.42”, “bounding_box”: {“x1”: “316”, “y1”: “407”, “x2”: “402”, “y2”: “424”}}]
Status3: SC_OK

Discussion: Yeah, the machine learning got the person with high confidence, and the surfboard is more likely than the skis, which are out of probability context; I mean, do you see any sea or snow?!

mX5🐞 executed: 27/04/2024 10:21:59 Runtime: 0:0:6.160 Memload: 74% use
RemObjects Pascal Script. Copyright (c) 2004-2024 by RemObjects Software & maXbox5

Run as Script
Märklin Europa

[{“label”: “bus“, “confidence”: “0.65”, “bounding_box”: {“x1”: “56”, “y1”: “240”, “x2”: “1702”, “y2”: “695”}}, {“label”: “truck“, “confidence”: “0.59”, “bounding_box”: {“x1”: “56”, “y1”: “240”, “x2”: “1702”, “y2”: “695”}}, {“label”: “person”, “confidence”: “0.44”, “bounding_box”: {“x1”: “1461”, “y1”: “325”, “x2”: “1523”, “y2”: “374”}}, {“label”: “truck”, “confidence”: “0.43”, “bounding_box”: {“x1”: “143”, “y1”: “547”, “x2”: “737”, “y2”: “693”}}, {“label”: “person”, “confidence”: “0.39”, “bounding_box”: {“x1”: “1533”, “y1”: “326”, “x2”: “1583”, “y2”: “371”}}, {“label”: “person”, “confidence”: “0.36”, “bounding_box”: {“x1”: “203”, “y1”: “323”, “x2”: “260”, “y2”: “370”}}, {“label”: “train”, “confidence”: “0.36”, “bounding_box”: {“x1”: “56”, “y1”: “240”, “x2”: “1702”, “y2”: “695”}}, {“label”: “car”, “confidence”: “0.35”, “bounding_box”: {“x1”: “156”, “y1”: “557”, “x2”: “731”, “y2”: “686”}}, {“label”: “person”, “confidence”: “0.31”, “bounding_box”: {“x1”: “1472”, “y1”: “340”, “x2”: “1518”, “y2”: “374”}}, {“label”: “person”, “confidence”: “0.31”, “bounding_box”: {“x1”: “261”, “y1”: “325”, “x2”: “320”, “y2”: “370”}}, {“label”: “person”, “confidence”: “0.3”, “bounding_box”: {“x1”: “388”, “y1”: “332”, “x2”: “434”, “y2”: “375”}}]

Trix Express 2243
Supertramp

[{“label”: “dining table“, “confidence”: “0.63”, “bounding_box”: {“x1”: “122”, “y1”: “308”, “x2”: “604”, “y2”: “679”}}, {“label”: “bench“, “confidence”: “0.57”, “bounding_box”: {“x1”: “122”, “y1”: “308”, “x2”: “604”, “y2”: “679”}}, {“label”: “bench”, “confidence”: “0.41”, “bounding_box”: {“x1”: “93”, “y1”: “119”, “x2”: “615”, “y2”: “665”}}, {“label”: “dining table”, “confidence”: “0.37”, “bounding_box”: {“x1”: “92”, “y1”: “115”, “x2”: “617”, “y2”: “667”}}, {“label”: “bed”, “confidence”: “0.35”, “bounding_box”: {“x1”: “98”, “y1”: “173”, “x2”: “617”, “y2”: “670”}}, {“label”: “bench”, “confidence”: “0.32”, “bounding_box”: {“x1”: “316”, “y1”: “537”, “x2”: “521”, “y2”: “671”}}, {“label”: “chair“, “confidence”: “0.3”, “bounding_box”: {“x1”: “149”, “y1”: “339”, “x2”: “583”, “y2”: “673”}}]
Status3: SC_OK

5 Nations

[{“label”: “truck“, “confidence”: “0.7”, “bounding_box”: {“x1”: “-14”, “y1”: “257”, “x2”: “1679”, “y2”: “1243”}}, {“label”: “train“, “confidence”: “0.62”, “bounding_box”: {“x1”: “537”, “y1”: “240”, “x2”: “1654”, “y2”: “1270”}}, {“label”: “person“, “confidence”: “0.59”, “bounding_box”: {“x1”: “341”, “y1”: “991”, “x2”: “412”, “y2”: “1194”}]

Distributed Code Schema

To discuss distributed and dependent software we need a code schema as a template, for example a routine which counts the occurrences of a word in a text:

function CountWords_(const subtxt: string; Txt: string): Integer;
begin
  if (Length(subtxt)=0) Or (Length(Txt)=0) Or (Pos(subtxt,Txt)=0) then
    result:= 0
  else
    result:= (Length(Txt)- Length(StringReplace(Txt,subtxt,'',
                              [rfReplaceAll]))) div Length(subtxt);
end;

We can show that the function itself uses other functions from libraries to fulfill the main task of counting or searching words in a text, so data (words) and functions are distributed and dependent on each other.
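The same counting idea reads almost identically in Python; this is a hypothetical sketch, not part of the distributed schema itself:

def count_words(subtxt: str, txt: str) -> int:
    # precondition: empty input or no occurrence returns 0
    if not subtxt or not txt or subtxt not in txt:
        return 0
    # remove all occurrences and divide the length difference by the substring length
    return (len(txt) - len(txt.replace(subtxt, ''))) // len(subtxt)

print(count_words('code', 'multi code is code in multicode'))   # 3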

Also, a code contract as a precondition makes sure we get valid data, using an if statement:

function CountWords__(const subtxt:string; Txt:string): Integer;
begin
  if (len(subtxt)=0) or (len(Txt)=0) or (Pos(subtxt,Txt)=0) then
    result:= 0
  else
    result:= (len(Txt)- len(StringReplace(Txt,subtxt,'',
                              [rfReplaceAll]))) div len(subtxt);
end;

The second version is an optimization that operates with or and len instead of the longer forms in the function above.

function HTTPClassComponentAPIDetection2(AURL,askstream,aApikey:string): string;
var HttpReq1: THttpRequestC;
    Body: TMultipartFormBody;
    Body2: TUrlEncodedFormBody;  //ct: TCountryCode;
begin
  Body:= TMultipartFormBody.Create;
  Body.ReleaseAfterSend:= True;
  //Body.Add('code','2','application/octet-stream');
  Body.AddFromFile('image',
         'C:\maxbox\maxbox51\examples\TEE_5_Nations_20240402.jpg');
   
  HttpReq1:= THttpRequestC.create(self);
  HttpReq1.headers.add('X-Api-Key:'+AAPIKEY);
  HttpReq1.headers.add('Accept:application/json');
  HttpReq1.SecurityOptions:= [soSsl3, soPct, soIgnoreCertCNInvalid];
  try
    if HttpReq1.Post1Multipart(AURL, body) then 
       writeln(HttpReq1.Response.ContentAsString)
    else Writeln('APIError '+inttostr(HttpReq1.Response.StatusCode2));
  finally  
    writeln('Status3: '+gethttpcod(HttpReq1.Response.statuscode2))
    HttpReq1.Free;  
    sleep(200)
    // if assigned(body) then body.free;
  end; 
end;

Distributed Time App


The DayTime protocol is not the NTP protocol. DayTime uses port 13, not 37; port 37 is used by the Time protocol, which, again, is not the NTP protocol, which uses port 123 (UDP). I do not know if time.windows.com supports the DayTime and Time protocols; the most commonly used protocols to get time from a reliable time source nowadays are NTP and its simpler sibling SNTP, which superseded both the DayTime and Time protocols.
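For illustration, here is a minimal SNTP query sketch in Python against the same pool host used in the script below; it only reads the time, it does not set the system clock:

import socket, struct, time

NTP_SERVER = '0.debian.pool.ntp.org'
NTP_EPOCH_OFFSET = 2208988800          # seconds between 1900-01-01 (NTP epoch) and 1970-01-01 (Unix epoch)

packet = b'\x1b' + 47 * b'\x00'        # LI=0, VN=3, Mode=3 (client request)
with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
    s.settimeout(5)
    s.sendto(packet, (NTP_SERVER, 123))  # NTP/SNTP uses UDP port 123
    data, _ = s.recvfrom(48)

transmit_seconds = struct.unpack('!I', data[40:44])[0]   # transmit timestamp, seconds part
print(time.ctime(transmit_seconds - NTP_EPOCH_OFFSET))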

var ledTimer2: TTimer; 
    ledLbl2: TLEDNumber;
    
procedure CloseClickCompact(Sender: TObject; var action: TCloseAction);
begin
  if ledTimer2 <> Nil then begin
    ledTimer2.enabled:= false;
    ledTimer2.Free;
    ledTimer2:= Nil;
  end;
  action:= caFree;
  writeln('compact timer form close at '+FormatDateTime('dd/mm/yyyy hh:nn:ss', Now));
end;
 
procedure updateLED2_event(sender: TObject);
begin
  ledLbl2.caption:= TimeToStr(time);
end;
 
procedure loadcompactTimeForm;
 var frm: TForm;
 begin
   frm:= TForm.create(self);
   frm.Setbounds(10,10,400, 300);
   frm.onclose:= @CloseClickCompact;
   frm.icon.loadfromresourcename(hinstance, 'TIMER2');
   frm.color:= clblack;
   frm.show;
   ledlbl2:= TLEDNumber.create(Frm);
   with ledlbl2 do begin
     Parent:= Frm;
     setBounds(35,140,350,100);
     caption:= TimeToStr(time);
     columns:= 10;
     size:= 3;
   end;
   ledtimer2:= TTimer.create(self);
   ledtimer2.interval:= 1000;
   ledtimer2.ontimer:= @updateLED2_event;   
 end;  
 
 
const
  ArgInstallUpdate = '/install_update';
  ArgRegisterExtension = '/register_global_file_associations';
   
procedure SetSynchroTime;
var mySTime: TIdSNTP;
begin
  mySTime:= TIdSNTP.create(self);
  try
    mySTime.host:='0.debian.pool.ntp.org';
    writeln('the internettime '+
       datetimetoStr(mystime.datetime));
    // needs to be admin & elevated 
    writeln('IsElevated '+ botostr(IsElevated)); 
    writeln(Format('IsUACEnabled: %s',[BoolToStr(IsUACEnabled, True)]));
    writeln('run elevated: '+itoa(SetLastError(RunElevated(ArgInstallUpdate, hinstance, nil))));//Application.ProcessMessages));
    if mySTime.Synctime then begin
      writeln('operating system sync now as admin & elevated!');
      Speak('System time is now sync with the internet time '+TimeToStr(time))
    end;  
  finally
    mySTime.free;
  end;
end;

TIdSNTP.SyncTime() uses the Win32 SetLocalTime() function, which requires the calling process to have the SE_SYSTEMTIME_NAME privilege present in its user token (even if it is not enabled, 
it just needs to be present – SetLocalTime() will enable it for the duration of the call).  
By default, only an elevated admin has that privilege present in its token.  
So, you will have to either run your whole app as an elevated user, or at least split out your sync code into a separate process or service that runs as an elevated admin or the SYSTEM account.

[{"label": "airplane", "confidence": "0.67", "bounding_box": {"x1": "52", "y1": "16", "x2": "840", "y2": "1251"}}, {"label": "train", "confidence": "0.37", "bounding_box": {"x1": "52", "y1": "16", "x2": "840", "y2": "1251"}}, {"label": "person", "confidence": "0.36", "bounding_box": {"x1": "470", "y1": "1080", "x2": "514", "y2": "1243"}}]

Locomotive LEMACO H0-097/2 SNCF CC 40108
Last Teaching List LTL
Virus Total Top VTT

https://sourceforge.net/projects/maxbox/files/Examples/13_General/984_Drawaclock_computergraphic64.txt/download

HAG Limitiert

[{“label”: “boat“, “confidence”: “0.64”, “bounding_box”: {“x1”: “392”, “y1”: “473”, “x2”: “1895”, “y2”: “883”}}, {“label”: “boat”, “confidence”: “0.63”, “bounding_box”: {“x1”: “65”, “y1”: “493”, “x2”: “262”, “y2”: “538”}}, {“label”: “boat“, “confidence”: “0.6”, “bounding_box”: {“x1”: “990”, “y1”: “468”, “x2”: “1910”, “y2”: “887”}}, {“label”: “boat“, “confidence”: “0.59”, “bounding_box”: {“x1”: “157”, “y1”: “731”, “x2”: “327”, “y2”: “780”}}, {“label”: “boat”, “confidence”: “0.57”, “bounding_box”: {“x1”: “72”, “y1”: “620”, “x2”: “180”, “y2”: “659”}}, {“label”: “boat“, “confidence”: “0.56”, “bounding_box”: {“x1”: “167”, “y1”: “743”, “x2”: “353”, “y2”: “786”}}, {“label”: “boat“, “confidence”: “0.54”, “bounding_box”: {“x1”: “328”, “y1”: “535”, “x2”: “388”, “y2”: “570”}}, {“label”: “person“, “confidence”: “0.54”, “bounding_box”: {“x1”: “1626”, “y1”: “632”, “x2”: “1680”, “y2”: “675”}}, {“label”: “person“, “confidence”: “0.52”, “bounding_box”: {“x1”: “1788”, “y1”: “666”, “x2”: “1841”, “y2”: “711”}}, {“label”: “boat“, “confidence”: “0.52”, “bounding_box”: {“x1”: “315”, “y1”: “756”, “x2”: “421”, “y2”: “791”}}, {“label”: “boat“, “confidence”: “0.52”, “bounding_box”: {“x1”: “347”, “y1”: “524”, “x2”: “390”, “y2”: “552”}}, {“label”: “person“, “confidence”: “0.51”, “bounding_box”: {“x1”: “1447”, “y1”: “645”, “x2”: “1484”, “y2”: “684”}}, {“label”: “boat”, “confidence”: “0.5”, “bounding_box”: {“x1”: “82”, “y1”: “613”, “x2”: “168”, “y2”: “642”}}, {“label”: “boat”, “confidence”: “0.5”, “bounding_box”: {“x1”: “921”, “y1”: “409”, “x2”: “986”, “y2”: “444”}}, {“label”: “boat”, “confidence”: “0.5”, “bounding_box”: {“x1”: “627”, “y1”: “503”, “x2”: “680”, “y2”: “531”}}, {“label”: “boat”, “confidence”: “0.48”, “bounding_box”: {“x1”: “180”, “y1”: “498”, “x2”: “266”, “y2”: “529”}}, {“label”: “bird“, “confidence”: “0.46”, “bounding_box”: {“x1”: “1626”, “y1”: “632”, “x2”: “1680”, “y2”: “675”}}, {“label”: “person”, “confidence”: “0.44”, “bounding_box”: {“x1”: “1915”, “y1”: “733”, “x2”: “1980”, “y2”: “779”}}, {“label”: “person”, “confidence”: “0.44”, “bounding_box”: {“x1”: “1950”, “y1”: “638”, “x2”: “1994”, “y2”: “689”}}, {“label”: “boat”, “confidence”: “0.44”, “bounding_box”: {“x1”: “326”, “y1”: “452”, “x2”: “370”, “y2”: “476”}}, {“label”: “person”, “confidence”: “0.43”, “bounding_box”: {“x1”: “1915”, “y1”: “747”, “x2”: “1987”, “y2”: “797”}}, {“label”: “boat”, “confidence”: “0.43”, “bounding_box”: {“x1”: “1123”, “y1”: “683”, “x2”: “1210”, “y2”: “728”}}, {“label”: “boat”, “confidence”: “0.42”, “bounding_box”: {“x1”: “648”, “y1”: “481”, “x2”: “675”, “y2”: “497”}}, {“label”: “person”, “confidence”: “0.42”, “bounding_box”: {“x1”: “1345”, “y1”: “646”, “x2”: “1378”, “y2”: “682”}}, {“label”: “boat”, “confidence”: “0.42”, “bounding_box”: {“x1”: “6”, “y1”: “494”, “x2”: “79”, “y2”: “534”}}, {“label”: “person”, “confidence”: “0.42”, “bounding_box”: {“x1”: “1771”, “y1”: “666”, “x2”: “1821”, “y2”: “710”}}, {“label”: “person”, “confidence”: “0.42”, “bounding_box”: {“x1”: “1538”, “y1”: “637”, “x2”: “1573”, “y2”: “679”}}, {“label”: “boat”, “confidence”: “0.42”, “bounding_box”: {“x1”: “825”, “y1”: “713”, “x2”: “1109”, “y2”: “898”}}, {“label”: “boat”, “confidence”: “0.4”, “bounding_box”: {“x1”: “318”, “y1”: “751”, “x2”: “388”, “y2”: “780”}}, {“label”: “person”, “confidence”: “0.39”, “bounding_box”: {“x1”: “1428”, “y1”: “672”, “x2”: “1455”, “y2”: “700”}}, {“label”: “boat”, “confidence”: “0.38”, “bounding_box”: {“x1”: “617”, “y1”: “410”, “x2”: “693”, “y2”: “439”}}, {“label”: “boat”, “confidence”: “0.38”, “bounding_box”: {“x1”: “118”, “y1”: 
“481”, “x2”: “173”, “y2”: “499”}}, {“label”: “boat”, “confidence”: “0.38”, “bounding_box”: {“x1”: “407”, “y1”: “764”, “x2”: “449”, “y2”: “784”}}, {“label”: “person”, “confidence”: “0.38”, “bounding_box”: {“x1”: “1196”, “y1”: “726”, “x2”: “1251”, “y2”: “783”}}, {“label”: “boat”, “confidence”: “0.38”, “bounding_box”: {“x1”: “15”, “y1”: “602”, “x2”: “69”, “y2”: “630”}}, {“label”: “bird”, “confidence”: “0.38”, “bounding_box”: {“x1”: “1737”, “y1”: “650”, “x2”: “1779”, “y2”: “689”}}, {“label”: “boat”, “confidence”: “0.37”, “bounding_box”: {“x1”: “9”, “y1”: “288”, “x2”: “1990”, “y2”: “855”}}, {“label”: “boat”, “confidence”: “0.37”, “bounding_box”: {“x1”: “7”, “y1”: “552”, “x2”: “61”, “y2”: “583”}}, {“label”: “bird”, “confidence”: “0.37”, “bounding_box”: {“x1”: “627”, “y1”: “503”, “x2”: “680”, “y2”: “531”}}, {“label”: “umbrella”, “confidence”: “0.37”, “bounding_box”: {“x1”: “1123”, “y1”: “683”, “x2”: “1210”, “y2”: “728”}}, {“label”: “boat”, “confidence”: “0.37”, “bounding_box”: {“x1”: “751”, “y1”: “801”, “x2”: “942”, “y2”: “887”}}, {“label”: “person”, “confidence”: “0.36”, “bounding_box”: {“x1”: “72”, “y1”: “620”, “x2”: “180”, “y2”: “659”}}, {“label”: “person”, “confidence”: “0.36”, “bounding_box”: {“x1”: “328”, “y1”: “535”, “x2”: “388”, “y2”: “570”}}, {“label”: “boat”, “confidence”: “0.36”, “bounding_box”: {“x1”: “242”, “y1”: “481”, “x2”: “289”, “y2”: “499”}}, {“label”: “boat”, “confidence”: “0.36”, “bounding_box”: {“x1”: “74”, “y1”: “413”, “x2”: “1478”, “y2”: “838”}}, {“label”: “boat”, “confidence”: “0.36”, “bounding_box”: {“x1”: “550”, “y1”: “467”, “x2”: “573”, “y2”: “480”}}, {“label”: “person”, “confidence”: “0.36”, “bounding_box”: {“x1”: “1351”, “y1”: “666”, “x2”: “1383”, “y2”: “705”}}, {“label”: “boat”, “confidence”: “0.35”, “bounding_box”: {“x1”: “826”, “y1”: “408”, “x2”: “868”, “y2”: “440”}}, {“label”: “bird”, “confidence”: “0.35”, “bounding_box”: {“x1”: “1771”, “y1”: “666”, “x2”: “1821”, “y2”: “710”}}, {“label”: “person”, “confidence”: “0.35”, “bounding_box”: {“x1”: “921”, “y1”: “409”, “x2”: “986”, “y2”: “444”}}, {“label”: “person”, “confidence”: “0.34”, “bounding_box”: {“x1”: “80”, “y1”: “461”, “x2”: “1895”, “y2”: “865”}}, {“label”: “person”, “confidence”: “0.34”, “bounding_box”: {“x1”: “347”, “y1”: “524”, “x2”: “390”, “y2”: “552”}}, {“label”: “boat”, “confidence”: “0.34”, “bounding_box”: {“x1”: “1126”, “y1”: “760”, “x2”: “1856”, “y2”: “894”}}, {“label”: “boat”, “confidence”: “0.34”, “bounding_box”: {“x1”: “737”, “y1”: “403”, “x2”: “782”, “y2”: “430”}}, {“label”: “bird”, “confidence”: “0.34”, “bounding_box”: {“x1”: “1788”, “y1”: “666”, “x2”: “1841”, “y2”: “711”}}, {“label”: “person”, “confidence”: “0.34”, “bounding_box”: {“x1”: “407”, “y1”: “764”, “x2”: “449”, “y2”: “784”}}, {“label”: “person”, “confidence”: “0.34”, “bounding_box”: {“x1”: “82”, “y1”: “613”, “x2”: “168”, “y2”: “642”}}, {“label”: “boat”, “confidence”: “0.34”, “bounding_box”: {“x1”: “29”, “y1”: “485”, “x2”: “81”, “y2”: “503”}}, {“label”: “boat”, “confidence”: “0.34”, “bounding_box”: {“x1”: “457”, “y1”: “422”, “x2”: “476”, “y2”: “439”}}, {“label”: “boat”, “confidence”: “0.33”, “bounding_box”: {“x1”: “540”, “y1”: “419”, “x2”: “564”, “y2”: “438”}}, {“label”: “person”, “confidence”: “0.33”, “bounding_box”: {“x1”: “345”, “y1”: “820”, “x2”: “397”, “y2”: “841”}}, {“label”: “person”, “confidence”: “0.33”, “bounding_box”: {“x1”: “315”, “y1”: “756”, “x2”: “421”, “y2”: “791”}}, {“label”: “boat”, “confidence”: “0.33”, “bounding_box”: {“x1”: “1233”, “y1”: “419”, “x2”: “1273”, “y2”: “437”}}, {“label”: “person”, “confidence”: “0.33”, 
“bounding_box”: {“x1”: “1359”, “y1”: “587”, “x2”: “1397”, “y2”: “618”}}, {“label”: “bird”, “confidence”: “0.33”, “bounding_box”: {“x1”: “1123”, “y1”: “683”, “x2”: “1210”, “y2”: “728”}}, {“label”: “person”, “confidence”: “0.33”, “bounding_box”: {“x1”: “1952”, “y1”: “652”, “x2”: “1995”, “y2”: “701”}}, {“label”: “boat”, “confidence”: “0.33”, “bounding_box”: {“x1”: “981”, “y1”: “415”, “x2”: “1023”, “y2”: “436”}}, {“label”: “boat”, “confidence”: “0.32”, “bounding_box”: {“x1”: “1164”, “y1”: “701”, “x2”: “1782”, “y2”: “858”}}, {“label”: “person”, “confidence”: “0.32”, “bounding_box”: {“x1”: “1737”, “y1”: “650”, “x2”: “1779”, “y2”: “689”}}, {“label”: “person”, “confidence”: “0.32”, “bounding_box”: {“x1”: “839”, “y1”: “745”, “x2”: “892”, “y2”: “807”}}, {“label”: “boat”, “confidence”: “0.32”, “bounding_box”: {“x1”: “1233”, “y1”: “607”, “x2”: “1632”, “y2”: “847”}}, {“label”: “person”, “confidence”: “0.32”, “bounding_box”: {“x1”: “1204”, “y1”: “746”, “x2”: “1326”, “y2”: “802”}}, {“label”: “bird”, “confidence”: “0.31”, “bounding_box”: {“x1”: “1950”, “y1”: “638”, “x2”: “1994”, “y2”: “689”}}, {“label”: “boat”, “confidence”: “0.31”, “bounding_box”: {“x1”: “1154”, “y1”: “417”, “x2”: “1208”, “y2”: “433”}}, {“label”: “boat”, “confidence”: “0.31”, “bounding_box”: {“x1”: “442”, “y1”: “455”, “x2”: “463”, “y2”: “474”}}, {“label”: “person”, “confidence”: “0.31”, “bounding_box”: {“x1”: “157”, “y1”: “731”, “x2”: “327”, “y2”: “780”}}, {“label”: “bird”, “confidence”: “0.31”, “bounding_box”: {“x1”: “877”, “y1”: “536”, “x2”: “905”, “y2”: “557”}}, {“label”: “person”, “confidence”: “0.31”, “bounding_box”: {“x1”: “1210”, “y1”: “670”, “x2”: “1259”, “y2”: “715”}}, {“label”: “person”, “confidence”: “0.31”, “bounding_box”: {“x1”: “167”, “y1”: “743”, “x2”: “353”, “y2”: “786”}}, {“label”: “boat”, “confidence”: “0.31”, “bounding_box”: {“x1”: “3”, “y1”: “619”, “x2”: “64”, “y2”: “655”}}, {“label”: “person”, “confidence”: “0.3”, “bounding_box”: {“x1”: “326”, “y1”: “452”, “x2”: “370”, “y2”: “476”}}, {“label”: “person”, “confidence”: “0.3”, “bounding_box”: {“x1”: “627”, “y1”: “503”, “x2”: “680”, “y2”: “531”}}, {“label”: “boat”, “confidence”: “0.3”, “bounding_box”: {“x1”: “366”, “y1”: “534”, “x2”: “414”, “y2”: “561”}}, {“label”: “boat”, “confidence”: “0.3”, “bounding_box”: {“x1”: “643”, “y1”: “719”, “x2”: “1101”, “y2”: “879”}}]
Status3: SC_OK

API Board

JSON Iterate

function HTTPClassComponentAPIDetection2(AURL,askfile,aApikey:string): string;
var HttpReq1: THttpRequestC;
    Body: TMultipartFormBody;
    Body2: TUrlEncodedFormBody;  //ct: TCountryCode;
    jo, locate: TJSON;  jobj:TJsonArray2; res: string; i, cnt:integer;

begin
  Body:= TMultipartFormBody.Create;
  Body.ReleaseAfterSend:= True;
  //Body.Add('code','2','application/octet-stream');
  Body.AddFromFile('image', askfile);
  HttpReq1:= THttpRequestC.create(self);
  HttpReq1.headers.add('X-Api-Key:'+AAPIKEY);
  HttpReq1.headers.add('Accept:application/json');
  HttpReq1.SecurityOptions:= [soSsl3, soPct, soIgnoreCertCNInvalid];
  try
    if HttpReq1.Post1Multipart(AURL, body) then begin
        res:= HttpReq1.Response.ContentAsString;
        //StrReplace(res, '[{', '{');
        jo:= TJSON.create;
        jo.parse(res);
        jobj:= jo.jsonArray;
        writeln('items found: '+itoa(jobj.count));
        //StrReplace(res, '{', '[{');
       //  cnt:= jobj.values['public_holidays'].asobject['label'].asarray.count;
       for i:= 0 to jobj.count-1 do 
          writeln(itoa(i)+':'+jobj[i].asobject['label'].asstring+' conf %: '+
                              jobj[i].asobject['confidence'].asstring);  
        //jo.values['public_holidays'].asobject['list'].asarray[it].asobject['description'].asstring

    end else Writeln('APIError '+inttostr(HttpReq1.Response.StatusCode2));
  finally 
    writeln('Status3: '+gethttpcod(HttpReq1.Response.statuscode2));
    HttpReq1.Free;  
    sleep(200);
    // if assigned(body) then body.free;
    jo.free;
  end; 
end;

items found: 6
0:person conf %: 0.46
1:person conf %: 0.38
2:boat conf %: 0.37
3:person conf %: 0.33
4:train conf %: 0.32
5:boat conf %: 0.31
Status3: SC_OK
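
The same iteration can be sketched in Python with the standard json module; the response string below is a shortened, hypothetical stand-in for the real detection output:

import json

res = '[{"label": "person", "confidence": "0.46"}, {"label": "boat", "confidence": "0.37"}]'
items = json.loads(res)
print('items found:', len(items))
for i, item in enumerate(items):
    print(f"{i}:{item['label']} conf %: {item['confidence']}")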

Klagenfurt

Version 5.1.4.98 History

V5.1.4.98 II

1 DBXCharDecoder_

2 IdModBusClient

3 IdModBusServer

4 IdStrings9

5 JclGraphUtils_

6 JvSegmentedLEDDisplay

7 JvSegmentedLEDDisplayMapperFrame

8 ModbusConsts

9 ModbusTypes

10 SynCrtSock

11 SynWinSock

12 uPSI_DBXCharDecoder

13 uPSI_IdModBusClient

14 uPSI_IdModBusServer

15 uPSI_JvSegmentedLEDDisplayMapperFrame

16 uPSI_SynCrtSock

17 uPSI_xrtl_util_CPUUtils

18 uPSI_HttpClasses.pas

19 uPSI_HttpUtils.pas

20 HttpClasses.pas

21 HttpUtils.pas

V5.1.4.98 III
GLCanvas
GLNavigator
GLParticles
GLStarRecord
JclComplex
uPSI_GLCanvas
uPSI_GLNavigator
uPSI_GLParticles
uPSI_GLSilhouette
uPSI_GLStarRecord
uPSI_JclComplex

V5.1.4.98 IV
ALAVLBinaryTree2
ALCommon

(ALExecute2)

ALFBXBase
ALFBXClient
ALFBXConst
ALFBXError
ALFBXLib
ALString_
ALWebSpider
uPSI_ALFBXClient
uPSI_ALFBXLib
uPSI_ALWebSpider

V5.1.4.98 V
ALFcnSQL
AlMySqlClient
ALMySqlWrapper
PJCBView
uPSI_ALFcnCGI
uPSI_ALFcnSQL
uPSI_AlMySqlClient
uPSI_DataSetUtils

V5.1.4.98 VII – 3705 Units
ESBDates
GpTimezone
hhAvALT
JclPCRE2
maXbox5_14beta90
SqlTxtRtns
uPSI_GpTimezone
uPSI_hhAvALT
uPSI_JclPCRE2
uPSI_SqlTxtRtns

V5.1.4.98 VIII – 3714 Units
AsciiShapes
IdWebSocketSimpleClient
uPSI_AsciiShapes
uPSI_IdWebSocketSimpleClient
uPSI_uWebUIMiscFunctions

uWebUIConstants
uWebUILibFunctions
uWebUIMiscFunctions
uWebUITypes

V5.1.4.98 IX – 3720 Units
ExecuteGLPanel
ExecuteidWebSocket
uPSI_ExecuteGLPanel
uPSI_ExecuteidWebSocket
Winapi.OpenGL
Winapi.OpenGLext

25.05.2024 17:18

Release Notes maXbox 5.1.4.98 IX May 2024 Ocean950
SHA1: 5.1.4.98 IX maXbox5.exe d5e5728f0dbfe563ffb8960341cee4949aa6fa31
SHA1: ZIP maxbox5.zip ABF343E710050CC4C3C0276E2815F7C908C8DC6E
https://archive.org/details/maxbox5

APILayer Image2Text

URL_APILAY = 'https://api.apilayer.com/image_to_text/url?url=%s'; 

maXbox5 1176_APILayer_Demo64ipso.txt Compiled done: 06/06/2024 15:40:24

debug size: 10146
{“lang”: “de”, “all_text”: “Elisabethenanlage 9

  1. adesso
  2. ti&m
  3. X HHM
    6.
    Schulungsräume
    601-608
    business.
    people.
    technology
    5.
    Schulungsräume
    501-509
  4. Schulungsräume
    401-409
    3.
    Schulungsräume
    301-309
  5. ipso Bildung
    Empfang
    seit 1876
  6. Schulverwaltung
    Cafeteria
    NSH ipso
    WIMG
    IBZ ipso
    BILDUNGS
    Business
    ZENTRUM School
    De Server Schula
    Teksture Manage
    Executive
    Education
    EG Praxisraum Zahnmedizin
    Zimmer 001, 003, 004, 005
    Zugang über 1. OG
    HWS
    Haber Widemann Schule”, “annotations”: [“Elisabethenanlage”, “9”, “9.”, “adesso”, “8.”, “ti”, “&”, “m”, “7.”, “X”, “HHM”, “6”, “.”, “Schulungsräume”, “601-608”, “business”, “.”, “people”, “.”, “technology”, “5”, “.”, “Schulungsräume”, “501-509”, “4.”, “Schulungsräume”, “401-409”, “3”, “.”, “Schulungsräume”, “301-309”, “2.”, “ipso”, “Bildung”, “Empfang”, “seit”, “1876”, “1.”, “Schulverwaltung”, “Cafeteria”, “NSH”, “ipso”, “WIMG”, “IBZ”, “ipso”, “BILDUNGS”, “Business”, “ZENTRUM”, “School”, “De”, “Server”, “Schula”, “Teksture”, “Manage”, “Executive”, “Education”, “EG”, “Praxisraum”, “Zahnmedizin”, “Zimmer”, “001”, “,”, “003”, “,”, “004”, “,”, “005”, “Zugang”, “über”, “1.”, “OG”, “HWS”, “Haber”, “Widemann”, “Schule”]}

mX5🐞 executed: 06/06/2024 15:40:25 Runtime: 0:0:2.703 Memload: 57% use
RemObjects Pascal Script. Copyright (c) 2004-2024 by RemObjects Software & maXbox5 Ver: 5.1.4.98 (514). Workdir: C:\Program Files\Streaming\IBZ2021\Module2_3\EKON26

Request served by 1781505b56ee58

GET / HTTP/1.1

Host: echo.websocket.org
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.7
Accept-Encoding: gzip, deflate, br, zstd
Accept-Language: de,de-DE;q=0.9,en;q=0.8,en-GB;q=0.7,en-US;q=0.6
Cache-Control: max-age=0
Fly-Client-Ip: 46.127.119.188
Fly-Forwarded-Port: 443
Fly-Forwarded-Proto: https
Fly-Forwarded-Ssl: on
Fly-Region: ams
Fly-Request-Id: 01HZSKBQQV29WH8KMWQSNSXPTH-ams
Fly-Traceparent: 00-3b53b2a0255e5b2b5fd02edf188828ba-18334a5211f33c03-00
Fly-Tracestate: 
Priority: u=0, i
Sec-Ch-Ua: "Microsoft Edge";v="125", "Chromium";v="125", "Not.A/Brand";v="24"
Sec-Ch-Ua-Mobile: ?0
Sec-Ch-Ua-Platform: "Windows"
Sec-Fetch-Dest: document
Sec-Fetch-Mode: navigate
Sec-Fetch-Site: none
Sec-Fetch-User: ?1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/125.0.0.0 Safari/537.36 Edg/125.0.0.0
Via: 2 fly.io
X-Forwarded-For: 46.127.119.188, 66.241.124.119
X-Forwarded-Port: 443
X-Forwarded-Proto: https
X-Forwarded-Ssl: on
X-Request-Start: t=1717772345083184
Multicore
Multicore 8 Cores 64-bit mX5

https://sourceforge.net/projects/maxbox/files/Examples/13_General/1297_cpu_core_information.txt/download

CPU Multicore Script

https://sourceforge.net/projects/maxbox/files/Examples/13_General/630_multikernel4_8_64.TXT/download

—————————————–

Agenda Mod V 13/06/2024

—————————————–
– 08:30 Welcome

– Recap agsi-data-storage-api/
—               1299_odbc_ipso64_21ict/  

– 08:40 Chapter 5

—  Lecture Distributed Multimedia Systems Finish

—  Script 1 http://www.softwareschule.ch/examples/surprise.txt

—  Script 2 WordWheel1.htm performance Test

– 10:00 Break

– 10:15 Presentation of ICT_IT_DistSys_Modulpruefung_321_2024_BA.pdf

—  Script 3 1299_odbc_ipso64_21ict.txt

– 11:00 Start of the module exam (Modulpruefung)

– 16:00 Electronic submission of the module exam

maXbox Agenda
Crypto Box
Distributed Software Development

Start with maXbox5 ImageAI Detector —>
this first line fine
person : 99.97
person : 99.98
person : 99.87
person : 99.83
person : 99.87
person : 99.78
person : 99.51
person : 99.82
integrate image detector compute ends…

elapsedSeconds:= 4.415644700000
debug: 8-I/O error 105 0 err:20
no console attached..
mX5🐞 executed: 13/06/2024 22:27:36 Runtime: 0:0:7.308 Memload: 66% use

Roco meets HAG

{
"lang": "en",
"all_text": "Video\nC000\nCBL 140703\nBitte Halbtax-Abo",
"annotations": [
"Video",
"C000",
"CBL",
"140703",
"Bitte",
"Halbtax",
"-",
"Abo"
]
}

[
{
"label": "person",
"confidence": "0.66",
"bounding_box": {
"x1": "629",
"y1": "1075",
"x2": "1232",
"y2": "1895"
}
},
{
"label": "book",
"confidence": "0.56",
"bounding_box": {
"x1": "214",
"y1": "1234",
"x2": "560",
"y2": "1518"
}
},
{
"label": "laptop",
"confidence": "0.53",
"bounding_box": {
"x1": "210",
"y1": "1232",
"x2": "555",
"y2": "1514"
}
},
{
"label": "chair",
"confidence": "0.53",
"bounding_box": {
"x1": "629",
"y1": "1075",
"x2": "1232",
"y2": "1895"
}
},
{
"label": "tv",
"confidence": "0.51",
"bounding_box": {
"x1": "519",
"y1": "330",
"x2": "941",
"y2": "609"
}
},
mX5 codemap copilot
Updated Pandas 2.2.2

Status3: SC_OK
JSONback [{
"label": "dining table",
"confidence": "0.33",
"bounding_box": {
"x1": "0",
"y1": "2",
"x2": "639",
"y2": "475"
}
}
]

Live Demo
function HTTPClsComponentAPIDetection2(AURL, askstream, aApikey, afile: string): string;
var HttpReq1: THttpRequestC;
    Body: TMultipartFormBody;
    Body2: TUrlEncodedFormBody;  //ct: TCountryCode;
begin
  Body:= TMultipartFormBody.Create;
  Body.ReleaseAfterSend:= True;
  //Body.Add('code','2','application/octet-stream');
  Body.AddFromFile('image', afile);
    //test 'C:\maxbox\maxbox51\examples\blason_klagenfurt_20240527_114128_resized.jpg');
  
  HttpReq1:= THttpRequestC.create(self);
  HttpReq1.headers.add('X-Api-Key:'+AAPIKEY);
  HttpReq1.headers.add('Accept:application/json');
  HttpReq1.SecurityOptions:= [soSsl3, soPct, soIgnoreCertCNInvalid];
  try
    if HttpReq1.Post1Multipart(AURL, body) then 
       result:=HttpReq1.Response.ContentAsString
    else Writeln('APIError '+inttostr(HttpReq1.Response.StatusCode2));
  finally  
    writeln('Status3: '+gethttpcod(HttpReq1.Response.statuscode2));
    HttpReq1.Free;  
    sleep(200);
  end; 
end;

URL_NINJA_QRCODE = 'https://api.api-ninjas.com/v1/qrcode?format=%s&data=%s';

function GEO_getQRCode(AURL, aformat, adata, aApikey: string): string;
var httpq: THttpConnectionWinInet;
    rets: TMemoryStream;  
    heads: TStrings; iht: IHttpConnection;
begin
  httpq:= THttpConnectionWinInet.Create(true); 
  rets:= TMemoryStream.create;
  heads:= TStringlist.create;     
  try 
    heads.add('X-Api-Key='+aAPIkey);
    heads.add('Accept= image/jpg');
    iht:= httpq.setHeaders(heads);
    httpq.Get(Format(AURL,[aformat, adata]), rets);
    if httpq.getresponsecode=200 Then begin
       writeln('size of '+itoa(rets.size));
       rets.Position:= 0;
       //ALMimeBase64decodeStream(rets, rets2)
       rets.savetofile((exepath+'qrcodeimage.jpg'));
       openfile(exepath+'qrcodeimage.jpg');
     end  
      else result:='Failed:'+
             itoa(Httpq.getresponsecode)+Httpq.GetResponseHeader('message');
  except  
    writeln('EWI_HTTP: '+ExceptiontoString(exceptiontype,exceptionparam));
  finally
    httpq:= Nil;
    heads.Free;
    rets.Free;
  end;                  
end;                 //}


GEO_getQRCode(URL_NINJA_QRCODE, 'jpg', 'maXbox5', N_APIKEY);

from script: C:\maxbox\maxbox51\examples\1307_APILayer_Demo64_5_httprequestC2.txt

4 ML Solutions with Locomotion

The first solution starts with the tiny-yolov3.pt model from ImageAI:

# set up the detector (ImageAI), then use the pre-trained TinyYOLOv3 model
from imageai.Detection import ObjectDetection
detector = ObjectDetection()
detector.setModelTypeAsTinyYOLOv3()
detector.setModelPath(model_path)
# loads the model path specified above using the setModelPath() method
detector.loadModel()
custom = detector.CustomObjects(person=True, laptop=True, car=False, train=True, clock=True, chair=False, bottle=False, keyboard=True)

Start with maXbox5 ImageAI Detector —>
this first line fine
train : 80.25
integrate image detector compute ends…

elapsedSeconds:= 4.879268800000
no console attached..
mX5🐞 executed: 29/07/2024 09:53:49 Runtime: 0:0:8.143 Memload: 75% use

Then we asked why the model can't see the persons. It depends on the frame: by cropping the image it detects the persons but no longer the train!

image cropped

input_path=r"C:\maxbox\maxbox51\examples\1316_elsass_20240728_161420crop.jpg"

Start with maXbox5 ImageAI Detector —>
this first line fine
person : 99.29
person : 99.58
person : 98.74
integrate image detector compute ends…

elapsedSeconds:= 4.686975000000
no console attached..
mX5🐞 executed: 29/07/2024 10:09:30 Runtime: 0:0:7.948 Memload: 77% use

You can see one false positive in the green bounding box!

The second solution is an API, URL_APILAY_DETECT = 'https://api.api-ninjas.com/v1/objectdetection/';

The Object Detection API provides fast and accurate image object recognition using advanced neural networks developed by machine learning experts.

https://api-ninjas.com/api/objectdetection

function TestHTTPClassComponentAPIDetection2(AURL, askstream, aApikey: string): string;
var HttpReq1: THttpRequestC;
    Body: TMultipartFormBody;
    Body2: TUrlEncodedFormBody;  //ct: TCountryCode;
begin
  Body:= TMultipartFormBody.Create;
  Body.ReleaseAfterSend:= True;
  //Body.Add('code','2','application/octet-stream');
  //Body.AddFromFile('image', exepath+'randimage01.jpg');
  Body.AddFromFile('image',
                           'C:\maxbox\maxbox51\examples\1316_elsass_20240728_161420_resized.jpg');
  
  HttpReq1:= THttpRequestC.create(self);
  HttpReq1.headers.add('X-Api-Key:'+AAPIKEY);
  HttpReq1.headers.add('Accept:application/json');
  HttpReq1.SecurityOptions:= [soSsl3, soPct, soIgnoreCertCNInvalid];
  try
    if HttpReq1.Post1Multipart(AURL, body) then 
       result:=HttpReq1.Response.ContentAsString
    else Writeln('APIError '+inttostr(HttpReq1.Response.StatusCode2));
  finally  
    writeln('Status3: '+gethttpcod(HttpReq1.Response.statuscode2));
    HttpReq1.Free;  
    sleep(200);
    // if assigned(body) then body.free;
  end; 
end;

This is a POST with a multipart-form body stream and you need an API key; the result comes back as JSON:

Status3: SC_OK
back [ {
"label": "train",
"confidence": "0.76",
"bounding_box": {
"x1": "-6",
"y1": "291",
"x2": "1173",
"y2": "1347"
}
},
{
"label": "person",
"confidence": "0.72",
"bounding_box": {
"x1": "535",
"y1": "854",
"x2": "815",
"y2": "1519"
}
},
{
"label": "person",
"confidence": "0.69",
"bounding_box": {
"x1": "823",
"y1": "790",
"x2": "1055",
"y2": "1350"
}
},
as JSON back

The third solution wants to get the text back from the image. The Image to Text API detects and extracts text from images using state-of-the-art optical character recognition (OCR) algorithms. It can detect texts of different sizes, fonts, and even handwriting.

function Image_to_text_API2(AURL, url_imgpath, aApikey: string): string;
var httpq: THttpConnectionWinInet;
    rets: TStringStream;  
    heads: TStrings; iht: IHttpConnection; //losthost:THTTPConnectionLostEvent;
begin
  httpq:= THttpConnectionWinInet.Create(true); 
  rets:= TStringStream.create('');
  heads:= TStringlist.create;     
  try 
    heads.add('apikey='+aAPIkey);
    iht:= httpq.setHeaders(heads);
    httpq.Get(Format(AURL,[url_imgpath]), rets);
    if httpq.getresponsecode=200 Then result:= rets.datastring
      else result:='Failed:'+
             itoa(Httpq.getresponsecode)+Httpq.GetResponseHeader('message');
  except  
    writeln('EWI_HTTP: '+ExceptiontoString(exceptiontype,exceptionparam));
  finally
    httpq:= Nil;
    heads.Free;
    rets.Free;
  end;                  
end;                 //}

And the model is able to read the name of the locomotive:

{"lang":"en","all_text":"18130\nBERTHOLD","annotations":["18130","BERTHOLD"]}
mX5🐞 executed: 29/07/2024 11:04:12 Runtime: 0:0:3.527 Memload: 81% use

The fourth and last solution in this machine-learning package is a Python one, available as Python for maXbox or Python4Delphi:

procedure PyCode(imgpath, apikey: string);
begin
  with TPythonEngine.Create(Nil) do begin
  //pythonhome:= 'C:\Users\breitsch\AppData\Local\Programs\Python\Python37-32\';
  try
    loadDLL;
    autofinalize:= false;
    ExecString('import requests');
    ExecStr('url= "https://api.apilayer.com/image_to_text/url?url='+imgpath+'"'); 
    ExecStr('payload = {}');  
    ExecStr('headers= {"apikey": "'+apikey+'"}'); 
    Println(EvalStr('requests.request("GET",url, headers=headers, data=payload).text'));  
  except
    raiseError;        
  finally       
    free;
  end; 
 end;
end; 

{"lang": "en", "all_text": "18130\nBERTHOLD", "annotations": ["18130", "BERTHOLD"]}

Version: 3.12.4 (tags/v3.12.4:8e8a4ba, Jun 6 2024, 19:30:16) [MSC v.1940 64 bit (AMD64)]
mX5🐞 executed: 29/07/2024 11:18:13 Runtime: 0:0:4.60 Memload: 79% use

S.A.C.M. Elsässische Maschinenbauanstalt Graffenstaden C-Kuppler (2x)
– built in 1900

Built with simplicity in mind, ImageAI supports a list of state-of-the-art Machine Learning algorithms for image prediction, custom image prediction, object detection, video detection, video object tracking and image predictions trainings. ImageAI currently supports image prediction and training using 4 different Machine Learning algorithms trained on the ImageNet-1000 dataset. ImageAI also supports object detection, video detection and object tracking using RetinaNet, YOLOv3 and TinyYOLOv3 trained on COCO dataset. Finally, ImageAI allows you to train custom models for performing detection and recognition of new objects.

OCR with Neural Net

This API recognizes and reads a text embedded in pictures or photos. Image to Text API uses a neural net (LSTM) based OCR engine which is focused on line recognition, but also supports recognizing the character patterns. It supports both handwriting and printed materials as well as street maps.

APILayer is an API marketplace where your own API can also reach a broader audience, but first you need a free API key:

https://api-ninjas.com/api/imagetotext

First we need a picture to analyze; it was a photo whose content I hadn't had the time to read:

Latinum Museum at Neuchâtel

We use WinHttp.WinHttpRequest, JSON objects and the TGraphics library for loading and testing the REST client. We also pass the API key as a request header, so get a key first at: https://apilayer.com/marketplace

Then you must upload your image and put the link in a constant for passing at the API:

Const 
  URLIMAGEPATH3 = 'https://breitschtv.wordpress.com/wp-content/uploads/2024/04/scripttest_latin_20240316_133552.jpg?w=768';

The returned data is JSON with all the text extracted, and even the language of the scanned text is auto-detected. Before we dive into the code, this is the main part of the script:

function Image_to_text_API2(AURL, url_imgpath, aApikey: string): string;
var httpq: THttpConnectionWinInet;
    rets: TStringStream;  
    heads: TStrings; iht: IHttpConnection; //losthost:THTTPConnectionLostEvent;
begin
  httpq:= THttpConnectionWinInet.Create(true); 
  rets:= TStringStream.create('');
  heads:= TStringlist.create;     
  try 
    heads.add('apikey='+aAPIkey);
    iht:= httpq.setHeaders(heads);
    httpq.Get(Format(AURL,[url_imgpath]), rets);
    if httpq.getresponsecode=200 Then result:= rets.datastring
      else result:='Failed:'+
             itoa(Httpq.getresponsecode)+Httpq.GetResponseHeader('message');
  except  
    writeln('EWI_HTTP: '+ExceptiontoString(exceptiontype,exceptionparam));
  finally
    httpq:= Nil;
    heads.Free;
    rets.Free;
  end;                  
end;                 //}

When you first call the API, the result contains JSON escape sequences like \u0027 and \n, so you get d\u00e9tecteur instead of détecteur (a JSON string must be double-quoted). You therefore need a stringify or parseJsonValue function to unescape d\u00e9tecteur back to détecteur:

Or can you guess what’s this is: Il s’agit de d\u00e9p\u00f4ts\nd’objets m\u00e9talliques

Answer: Il s’agit de dépôts
d’objets métalliques, dont l’existence est largement attestée à l’âge du Bronze.
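
A quick way to check the unescaping is Python's json module, which performs exactly this decoding (a small sketch using the escaped fragment from above):

import json

raw = '"Il s\\u2019agit de d\\u00e9p\\u00f4ts\\nd\\u2019objets m\\u00e9talliques"'
print(json.loads(raw))
# Il s’agit de dépôts
# d’objets métalliques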

backstr:= parseJsonvalue(Image_to_text_API2(URL_APILAY, 
URLIMAGEPATH3, 'dy5L70eQx72794XBZ8sewEgYTZR85_your APIKey'));

@main call

The API itself is simple and straightforward:

URL_APILAY = 'https://api.apilayer.com/image_to_text/url?url=%s'; 

In a last line we fix the \n (depending on your operating system or language) to get a proper carriage return and line feed:

var backstr, validtext: string;

backstr:= parseJsonvalue(Image_to_text_API2(URL_APILAY, 
                         URLIMAGEPATH3,  
                         'YTZR85_your APIKey'));

validtext:= StringReplace(backstr, '\n',CR+LF,[rfReplaceAll]); 
   
writeln(validtext);                               

If your programming language is not listed in the Code Example above, you can still make API calls by using a HTTP request library written in your programming language and following the above documentation.

This is an example in Python (Python4Delphi):

procedure PyCode(imgpath: string);
begin
  with TPythonEngine.Create(Nil) do begin
  pythonhome:= 'C:\Users\breitsch\AppData\Local\Programs\Python\Python37-32\';
  try
    loadDLL;
    ExecString('import requests');
    ExecStr('url= "https://api.apilayer.com/image_to_text/url?url='+imgpath+'"'); 
    ExecStr('payload = {}');  
    ExecStr('headers= {"apikey": "dy5L70eQx72794XBZ8sewEgYTZR85_yourAPIKey"}'); 
    Println(EvalStr('requests.request("GET",url, headers=headers, data=payload).text'));  
  except
    raiseError;        
  finally       
    free;
  end; 
 end;
end; 

And in Real Python:

import requests

api_url = 'https://api.api-ninjas.com/v1/imagetotext'
image_file_descriptor = open('YOUR_IMAGE.jpeg', 'rb')
files = {'image': image_file_descriptor}
# the X-Api-Key request header is required, as in the Pascal calls above
r = requests.post(api_url, files=files, headers={'X-Api-Key': 'YOUR_API_KEY'})
print(r.json())

Image2Text, or Image to Text, is provided as an API service on the APILayer publication platform. The Live Demo feature allows you to test the API within your browser; no need to install or code anything. You can modify all the parameters as you like and interact with the API from many languages.

The result can be (99.97%):

{"lang":"fr","all_text":"TSAPIENS
Des rives du lac au
pied du Chasseral:
découvertes
inédites de dépôts
de l'âge du Bronze
10
La collaboration initiée depuis quelques années entre la section Archéologie de l'Office du patrimoine et de l'archéologie du canton de
Neuchâtel (OPAN) et des prospecteurs amateurs a conduit à la découverte d'un type de sites rare dans notre région. Il s'agit de dépôts
d'objets métalliques, dont l'existence est largement attestée à l'âge du Bronze à l'échelle européenne. Ces découvertes manifestent
une complémentarité exemplaire entre professionnels et amateurs d'archéologie dans le cadre de projets explorant l'occupation du
territoire neuchitelois à travers le temps.
LS Models & HAG
Code AI Generator Challenge
V 5.1.4.95 Release
\maxbox5\1234_NeuralNetwork2_XOR_sampleEU_EKON27_64.pas File stored
Liliput 1042 Vintage
Liliput 1042
CC 6500 50 Ans

QR Code Generator API

The QRCode API generates custom QR codes for any content. It supports custom colors, size, and several image formats.

function GEO_getQRCode(AURL, aformat, adata, aApikey: string): string;
var httpq: THttpConnectionWinInet;
    rets: TMemoryStream;  
    heads: TStrings; iht: IHttpConnection;
begin
  httpq:= THttpConnectionWinInet.Create(true); 
  rets:= TMemoryStream.create;
  heads:= TStringlist.create;     
  try 
    heads.add('X-Api-Key='+aAPIkey);
    heads.add('Accept= image/png');
    iht:= httpq.setHeaders(heads);
    httpq.Get(Format(AURL,[aformat, adata]), rets);
    if httpq.getresponsecode=200 Then begin
       writeln('size of '+itoa(rets.size));
       rets.Position:= 0;
       //ALMimeBase64decodeStream(rets, rets2)
       rets.savetofile((exepath+'qrcodeimage.png'));
       openfile(exepath+'qrcodeimage.png');
     end  
      else result:='Failed:'+
             itoa(Httpq.getresponsecode)+Httpq.GetResponseHeader('message');
  except  
    writeln('EWI_HTTP: '+ExceptiontoString(exceptiontype,exceptionparam));
  finally
    httpq:= Nil;
    heads.Free;
    rets.Free;
  end;                  
end;                 //}

The heads.add('Accept= image/png'); is important; if you don't set this request header you have to use the statement ALMimeBase64decodeStream(rets, rets2) with a second memory stream to get the image format you want.
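
If you skip the Accept header and the body comes back Base64-encoded, the decoding step in Python looks like this (a hedged sketch; the variable and file names are hypothetical):

import base64

def save_base64_image(b64_text: str, path: str) -> None:
    # counterpart of ALMimeBase64decodeStream: decode the text body into raw image bytes
    with open(path, 'wb') as f:
        f.write(base64.b64decode(b64_text))

# usage with a hypothetical response body:
# save_base64_image(response_text, 'qrcodeimage.png')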

Sample Request with Live Demo – http://www.softwareschule.ch/examples/apilayer2.txt
AGSI API

Multicode Programming

maXbox Starter 119 — Get a different step Solution.

origin from: https://maxkleiner1.medium.com/multicode-programming-4e476ebc6004

“Code behaves like recording — recoding from brain to tool.

Source: 393_QRCode5_64TXT_211_7_tutor119_2.TXT

softwareschule.ch/examples/qrcode8.txt

Sometimes life is like playing with us: You write some code that solves a problem, and then someone comes along and makes the problem harder. Here’s how to continuously integrate new solutions without having to rewrite your old solutions (as much).

Let's start with a simple problem: You've written and tested a class that returns a QR code from a text line:

procedure QRCcode(apath: string; atext: string);
var aQRCode: TDelphiZXingQRCode; QRCodBmp: TBitmap; Jpg: TJpegImage;
    Row, Column, err: Integer; res: Boolean; //HRESULT;
    form1: TForm;
begin
  aQRCode:= TDelphiZXingQRCode.Create;
  QRCodBmp:= TBitmap.Create;
  form1:= getform2(700,500,123,'QR Draw PaintPerformPlatform PPP5');
  try
    aQRCode.Data:= atext;
    aQRCode.Encoding:= qrcAuto; //TQRCodeEncoding(cmbEncoding.ItemIndex);
    //aQRCode.QuietZone := StrToIntDef(edtQuietZone.Text, 4);
    QRCodBmp.SetSize(aQRCode.Rows, aQRCode.Columns);
    for Row:= 0 to aQRCode.Rows-1 do
      for Column:= 0 to aQRCode.Columns-1 do begin
        if (aQRCode.IsBlack[Row, Column]) then
          QRCodBmp.Canvas.Pixels[Column,Row]:= clBlack
        else
          QRCodBmp.Canvas.Pixels[Column,Row]:= clWhite;
      end;

That class is currently used by a single application in a scripting environment. Inside the class you might have Delphi VCL, JavaFX, ADO.NET or LINQ/Entity Framework code, but either way, you're accessing an internal component on your operating system, on your desktop's hard disk.

Then you realize that, because your component's data/algorithm doesn't change very often, as long as you don't update or rebuild the component you are not up to date with newer features like encoding or error-correction level.

This leads to a second solution: an external service call like Google Chart Tools. Using the Google Chart Tools / Image Charts (aka Chart API) you can easily generate QR codes; this kind of image is a special type of two-dimensional barcode. They are also known as hard-links or physical-world hyperlinks.

The Google Chart Tools also let you generate QR-code images using an HTTP POST or short messages. All you need to do to generate a QR code is to make a GET request to this URI:

http://chart.apis.google.com/chart?chs=200x200&cht=qr&chld=M&chl=Text

So we parametrize this URI1 for a get request call:

Const
URLGoogleQRCODE='https://chart.apis.google.com/chart?chs=%dx%d&cht=qr&chld=%s&chl=%s';

The API requires 4 simple fields be posted to it:

cht=qr this tells Google to create a QR code;
chld=M the error correction level of the QR code (more later);
chs=wxh the width and height of the image to return (e.g. chs=250x250);
chl=text the URL encoded text to be inserted into the qr-code.
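
A short Python sketch shows how these four fields assemble into the request URL; note that Google has meanwhile deprecated the Image Charts endpoint, so treat this purely as URL building:

from urllib.parse import urlencode

params = {
    'cht': 'qr',            # tell Google to create a QR code
    'chld': 'M',            # error correction level
    'chs': '150x150',       # width x height
    'chl': 'https://maxbox4.wordpress.com/',   # text to encode (will be URL encoded)
}
print('https://chart.apis.google.com/chart?' + urlencode(params))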

As the URL is https with a certificate, an application can identify and authenticate itself to any organization trusting the third party.

The second thing to consider is that (I assume) if the web service object is working right, we can compare the resulting picture with the first solution. So let's make the call with the WinInet Win-API:

const   QDATA= 'https://maxbox4.wordpress.com/';

Type TQrImage_ErrCorrLevel=(L,M,Q,H);

//1. WinInet -Win API
procedure GetQrCodeInet(Width, Height: Word; C_Level, apath: string; const Data: string);
var encodURL: string;
    pngStream: TMemoryStream;
begin
  encodURL:= Format(URLGoogleQRCODE,[Width,Height, C_Level, HTTPEncode(Data)]);
  pngStream:= TMemoryStream.create;
  HttpGet(encodURL, pngStream); //WinInet
  {with TLinearBitmap.Create do try
  //LoadFromStream2(pngStream,'PNG');
  //SaveToFile(apath); }
  try
    pngStream.Position:= 0;
    pngStream.savetofile(apath);
    sleep(500);
    OpenDoc(apath);
  finally
    //Dispose;
    pngStream.Free;
  end;
end;

And the call with wininet:

GetQrCodeInet(150,150,'Q',ExePath+'examples\'+AFILENAME,QData);
sleep(500);
writeln('SHA1 '+sha1(ExePath+AFILENAME)); //}

//SHA1 FE526D46BA48DFD820276872C969473A7B7DE91C

You should be able to see the content of the file with OpenDoc(apath); and we pass AFILENAME= 'mX5QRCode5.png';

https://maxbox4.wordpress.com/

So what's the meaning of the SHA1 hash? For this we compare with a third solution, an internet call with the Indy 10 framework. In contrast to WinInet as the internal WinAPI library, Indy is an external library based on OpenSSL, and we compare the result to get the same hash:

//2. Indy 10 Socks Lib -GoogleAPI
procedure GetQrCodeIndy(Width, Height: Word; C_Level, apath: string; const Data: string);
var encodURL: string;
    idhttp: TIdHttp; // THTTPSend;
    pngStream: TMemoryStream;
begin
  encodURL:= Format(URLGoogleQRCODE,[Width,Height,C_Level, HTTPEncode(Data)]);
  idHTTP:= TIdHTTP.Create(NIL);
  pngStream:= TMemoryStream.create;
  idHTTP.Get1(encodURL, pngStream);
  //writeln(idHTTP.Get(encodURL));
  //Exception: if not Dll-Could not load SSL library. at 827.447
  try
    pngStream.Position:= 0;
    writeln(itoa(pngStream.size));
    pngStream.savetofile(apath);
    sleep(500);
    OpenDoc(apath);
  finally
    idHTTP.Free;
    idHTTP:= Nil;
    pngStream.Free;
  end;
end;

As I said we need two DLLs to support the OpenSSL lib; provided OpenSSL is installed in your system. The call arguments are the same so we get the same hash back:

GetQrCodeIndy(150,150,'Q',ExePath+'examples\'+AFILENAME,QData);
sleep(500);
writeln('SHA1 '+sha1(ExePath+AFILENAME)); //}

intern: FE526D46BA48DFD820276872C969473A7B7DE91C
SHA1 FE526D46BA48DFD820276872C969473A7B7DE91C

openssl pkcs12 -info -in filename.p12

In cryptography, PKCS #12 defines an archive file format for storing many cryptography objects as a single file. It is commonly used to bundle a private key with its X.509 certificate or to bundle all the members of a chain of trust.

multicode in hybridcode

pic1: tutor119_signobjectscreen_6.png

You can either sign files out of a working directory, or you can place them in your Windows SDK\bin folder or key store.

Source Organisation for Multicode

This separation into now three solutions is reflected in a number of ways. The most important distinction is that the code schema for developers in the script division now has 3 folding sections and can differ from the current configuration design:

1. Solution 1 with the internal class needs no https and no external component.

2. Solution 2 depends on the external Google API and is based on the internal OS WinInet.

3. Solution 3 also depends on Google but has its own internet suite, Indy 10, which in turn depends on OpenSSL.

This we should consider and document in our source code repository:

BDS 22.0

pic2: tutor119_catrepo.png

The interesting point is to know where the code is running and how it is stored in the executable or the script itself. Embedding WinInet as one function is nice: HttpGet(encodURL, pngStream); //WinInet, but you don't have the flexibility to change, for example, request or response headers of the external web service you consume, so we test a fourth solution in detail, also to debug with more verbosity:

//4. Internal Class mXLib5 -GoogleAPI

procedure GetQrCodeWininetClass(Wid, Hei: Word; C_Level, apath: string; const Data: string);
var httpR: THttpConnectionWinInet;
    ms: TMemoryStream;
    heads: TStrings; iht: IHttpConnection; //losthost: THTTPConnectionLostEvent;
begin
  httpr:= THttpConnectionWinInet.Create(true);
  ms:= TMemoryStream.create;
  try
    //iht:= httpRq.setHeaders(heads);
    httpR.Get(Format(URLGoogleQRCODE,[Wid,Hei,C_Level,HTTPEncode(Data)]),ms);
    //httpRq.setRequestheader('x-key',aAPIkey);
    if httpr.getresponsecode=200 Then begin
      ms.Position:= 0;
      ms.savetofile(apath);
      sleep(500);
      OpenDoc(apath)
    end Else writeln('Failed responsecode:'+
                     itoa(HttpR.getresponsecode));
  except
    //writeln('Error: '+HttpRq.GetResponseHeader(''));
    writeln('EHTTP: '+ExceptiontoString(exceptiontype, exceptionparam));
  finally
    httpr:= Nil;
    ms.Free;
  end;
end;

Therefore we can, or must, parametrize the handler procedure in order to be able to indicate the type of request.

This is the good old Wininet API but this time as an object oriented class with methods and attributes, for example to check the response code of the get request. Also in this mapped import library we get the same hash: intern: FE526D46BA48DFD820276872C969473A7B7DE91C

My goal in refactoring or recoding is to supply “enough engineering” to support the current problem without over-engineering a solution to some later problem that might never exist. Said another way: a solution is never more complicated than the problem it is solving. But, as you know, it can be the other way round, for example in cryptography: simple problem but complex solution.

The fifth solution is tricky and only possible in an interpreted script: we call the second solution pre-compiled as one function.

5 Solutions Overview

  • Internal QR-Component Class mXLib5 TDelphiZXingQRCode
  • External call of script with procedure WinInet & Google API
  • External call of script with Indy class & Google API
  • Internal call of THttpConnectionWinInet class of external API
  • Internal call direct in script (precompiled):
    GetQrCode5(150,150,'Q',QDATA, ExePath+AFILENAME);

The Mystery of solution 6 and 7

To be really independent of internal calls and resolve everything at runtime, a late-binding solution can be considered.

Early (or static) binding refers to compile time binding as before and late (or dynamic) binding refers to runtime binding (for example when you use reflection or retyping).

Late binding uses CreateObject to create an instance of the application object, which you can then control. For example, to create a new instance of WinHttp.WinHttpRequest.5.1 using late binding in our sixth solution:

function QRCcodeOle(Wid, Hei: Word; C_Level, apath: string; const Data: string): string;
var httpReq, hr: Olevariant; instream: IStream;
    jo: TJSON; strm: TMemoryStream;
begin
  httpReq:= CreateOleObject('WinHttp.WinHttpRequest.5.1');
  //jo:= TJSON.Create();
  hr:= httpReq.Open('GET',
         format(URLGoogleQRCODE,[Wid,Hei,C_Level,HTTPEncode(Data)]));
  httpReq.setRequestheader('content-type','application/octet-stream');
  //httpReq.setRequestheader('Authorization','Bearer '+ CHATGPT_APIKEY2);
  if hr= S_OK then
    HttpReq.Send();
  strm:= TMemoryStream.create;
  If HttpReq.Status = 200 Then begin
    try
      //https://stackoverflow.com/questions/4938601/getting-an-istream-from-an-olevariant
      strm:= getmemStreamfromIStream2(HttpReq.responsestream);
      //getmemStreamfromIStream2file(hrstream, apath);
      writeln('responsestream size: '+itoa(strm.size));
      strm.savetoFile(apath);
      openFile(apath);
    except
      writeln('EHTTPex: '+ExceptiontoString(exceptiontype, exceptionparam));
    finally
      strm.free;
      httpreq:= unassigned;
    end;
  end;
end;

This solution is the load of an IStream from an OLE response stream as unknown variant type to a well known TMemoryStream in order to save the response stream to a file (in our example the binary QR-code image file as a *.png graphic image).
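
For comparison, the same late-binding call can be made from Python via COM, assuming the pywin32 package is installed (a sketch, not part of the original tutorial):

import win32com.client

# the COM object is resolved by name at runtime, like CreateOleObject above
req = win32com.client.Dispatch('WinHttp.WinHttpRequest.5.1')
req.Open('GET', 'https://maxbox4.wordpress.com/', False)
req.Send()
print(req.Status, len(req.ResponseText), 'characters received')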

RegEx as multicode

Pic3: tutor119_regex_multicod.png

The crux is the getmemStreamfromIStream2 function. I was probably not aware of TOleStream at the time I wrote this answer. Looking at TOleStream now, I notice that it does not support 64-bit streams. This code does. Other than that, this code is almost identical to the code that TOleStream uses, with the only exception being that this code's implementation of the Size property getter is more optimized than TOleStream's implementation, and this code implements the Size property setter whereas TOleStream does not. So we can combine the invoke call from HttpReq.responsestream to get a file in one function:

function getmemStreamfromIStream2File(avariant: variant;
  apath: string): TMemorystream;
var instream: IStream; ostream: TStream;
begin
  instream:= IUnknown(avariant) as IStream;
  ostream:= TOleStream.Create(instream);
  result:= TMemorystream.Create;
  try
    result.CopyFrom(OStream, OStream.Size);
    result.SaveToFile(apath);
  finally
    OStream.Free;
  end;
end;

And the last one, solution 7, catches everything from external, even the language and library; it's a Python for Delphi solution (P4D):

procedure PYdigitQRCode;
begin
  with TPythonEngine.Create(Nil) do begin
    pythonhome:= 'C:\Users\user\AppData\Local\Programs\Python\Python312\';
    //SetPythonHome; //pythonpath
    OnPathInitialization:= @TPathInitialization;
    try
      loadDLL;
      //opendll(PYDLL64)
      execstr('import sys, os, json, qrcode');
      println(EvalStr('sys.version'));
      println(EvalStr('sys.executable'));
      execstr('qr=qrcode.QRCode(error_correction=qrcode.constants.ERROR_CORRECT_Q)');
      execstr('qrcode.make("'+QDATA+'").save(".\examples\'+AFILENAME+'")');
      // println('is posix '+EvalStr('lib_platform.is_platform_posix'));
    except
      raiseError;
    finally
      unloadDll;
      free;
    end;
  end; //with
end;

>>>3.12.1 (tags/v3.12.1:2305ca5, Dec 7 2023, 22:03:25) [MSC v.1937 64 bit (AMD64)]

The standard install uses pypng to generate PNG files and can also render QR codes directly to the console. A standard install is just:

pip install qrcode

Just for the record, I found another solution, solution eight, with an Arduino board to send the text and get a bitmap back (in this context below only as a serial monitor graph):

#include "qrcode.h"   // header of the Arduino QRCode library (assumed include for this sketch)

const char* QDATA= "https://maxbox4.wordpress.com/";

void setup() {
  Serial.begin(115200);
  // Start time
  uint32_t dt = millis();

  // Create the QR code
  QRCode qrcode;
  uint8_t qrcodeData[qrcode_getBufferSize(3)];
  qrcode_initText(&qrcode, qrcodeData, 3, 0, QDATA);

  // Delta time
  dt = millis() - dt;
  Serial.print("QR Code Generation Time: ");
  Serial.print(dt);
  Serial.print("\n");

  // Top quiet zone
  Serial.print("\n\n\n\n");

  for (uint8_t y = 0; y < qrcode.size; y++) {
    // Left quiet zone
    Serial.print(" ");
    // Each horizontal module
    for (uint8_t x = 0; x < qrcode.size; x++) {
      // Print each module (UTF-8 \u2588 is a solid block)
      Serial.print(qrcode_getModule(&qrcode, x, y) ? "\u2588\u2588" : " ");
    }
    Serial.print("\n");
  }
  // Bottom quiet zone
  Serial.print("\n\n\n\n");
}

void loop() {
}
QRCode Generator on a Microprocessor Arduino Uno
Output to Serial Monitor with each module (UTF-8 \u2588 is a solid block)!
Connect to Arduino UNO with Async Pro

Conclusion

When it comes to problem-solving, there are often multiple solutions that can be used to solve the same problem. The choice of solution depends on various factors such as performance, storage, correctness, implementation, simplicity, and also scalability and security.

The Google Chart Tools (Chart API) also let you generate QR-code images using an HTTP POST call. A Quick Response code is a two-dimensional pictographic code used for its fast readability and comparatively large storage capacity.

Early binding refers to the assignment of values to variables at design time, whereas late binding refers to the assignment of values to variables at run time, as a concept of multicode programming. It is often implemented using [special] dynamic types, introspection/reflection, flags and compiler options, or through virtual methods by borrowing and extending dynamic dispatching.

Script: softwareschule.ch/examples/qrcode8.txt

softwareschule.ch/examples/qrcode8.htm

Report as PDF: Multicoding Tutor (softwareschule.ch)

References:

Compiled Project:

https://github.com/maxkleiner/maXbox4/releases/download/V4.2.4.80/maxbox5.zip

Free Automated Malware Analysis Service — powered by Falcon Sandbox (hybrid-analysis.com)

Topic:

https://stackoverflow.com/questions/4938601/getting-an-istream-from-an-olevariant

Preparation:

openWeb('https://qreader.online/');

The Mystery of IStream — Code Blog

Doc and Tool: https://maxbox4.wordpress.com

Max Kleiner 23/02/2024

1 Uniform Resource Identifier

Multicode Locomotion as 4 Solutions for 1 Problem
Lastline locs SBB 11253, DB 103 245, SNCF 6574
Lastline locs SBB 11253, DB 103 245, SNCF 6574 Secondo
Lastline locs SNCF 6574 SBB 11253, DB 103 245, FS E 444 Secondo
TEE 5 Nations Locs SNCF-DB-SBB-FS-ÖBB
Fleischmann 1337 P: 1959
Trix Express EMS 218 005-7
Cisalpin

ODBC Access 64-bit

If you want to access an ODBC data link in 64-bit you will see only SQL Server in the 64-bit ODBC administrator on Windows 11.

dual drivers on ODBC Admin

So you have to install the 64-bit drivers first:

To download the ODBC driver for ACCESS that allows you to open .accdb databases for Windows 11, you can follow these steps:

1. Visit the Microsoft Download Center https://support.microsoft.com/en-us/office/download-and-install-microsoft-365-access-runtime-185c5a32-8ba9-491e-ac76-91cbe3ea09c9

2. Look for the Microsoft Access Database Engine 2016 Redistributable.

3. Download the version that matches your system (32-bit or 64-bit).

4. After downloading, run the installer to install the ODBC driver.

from the Redistributable 2016

Then you can build the data source name in maXbox5: go to /Options/ADO SQL Workbench with the connection string as the result below:

Provider=MSDASQL;Persist Security Info=False;Extended Properties="DSN=mX5detailbase64;
DBQ=C:\maxbox\maxbox4\maxbox4\examples\detail.mdb;DriverId=25;FIL=MS Access;MaxBufferSize=2048;PageTimeout=5;UID=admin;"

Data Source Name Builder

Then you test the connection and open Table1 in the viewer:

64 bit in view for modify

Microsoft ODBC Driver for Access and SQL Server is a crucial component for developing native-code applications that connect to SQL Server and Azure SQL Database. It provides runtime support for applications using native code APIs to establish connections with SQL Server.
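
In the multicode spirit, the same Access database can also be opened from Python, assuming pyodbc and the 64-bit Access ODBC driver are installed (a hedged sketch):

import pyodbc

conn_str = (r'Driver={Microsoft Access Driver (*.mdb, *.accdb)};'
            r'DBQ=C:\maxbox\maxbox4\maxbox4\examples\detail.mdb;')
with pyodbc.connect(conn_str) as conn:
    cur = conn.cursor()
    cur.execute('SELECT * FROM Table1')   # the same table as opened in the viewer
    for row in cur.fetchmany(5):
        print(row)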

Cité du Train

Game of Life Sequence

Arnold N
Fleischmann BR 103 242

XN Resource Editor

Based on the latest publicly available sources of Colin Wilson’s XN Resource Editor.

(venv) C:\maxbox\bfh2023\bachmann_bfh.masthesis-1.0.0\bfh.masthesis-1.0.0>pytest -v
================================================= test session starts =================================================
platform win32 — Python 3.12.1, pytest-8.0.2, pluggy-1.4.0 — C:\maxbox\bfh2023\bachmann_bfh.masthesis-1.0.0\bfh.masthesis-1.0.0\venv\Scripts\python.exe
cachedir: .pytest_cache
rootdir: C:\maxbox\bfh2023\bachmann_bfh.masthesis-1.0.0\bfh.masthesis-1.0.0
plugins: cov-4.1.0
collected 31 items

tests/test_topology.py::test_parse_topology_xml PASSED [ 3%]
tests/test_topology.py::test_build_graph_from_dict PASSED [ 6%]
tests/test_topology.py::test_interpret_topo_elements PASSED [ 9%]
tests/test_topology.py::test_get_element___correct_existing_element PASSED [ 12%]
tests/test_topology.py::test_get_element___wrong_existing_element PASSED [ 16%]
tests/test_topology.py::test_get_element___non_existing_element PASSED [ 19%]
tests/test_topology.py::test_get_min_bkm_element___default_not_drawn PASSED [ 22%]
tests/test_topology.py::test_get_min_bkm_element___drawn_element PASSED [ 25%]
tests/test_topology.py::test_get_smaller_vp___vp1 PASSED [ 29%]
tests/test_topology.py::test_get_smaller_vp___vp2 PASSED [ 32%]
tests/test_topology.py::test_get_bigger_vp___vp1 PASSED [ 35%]
tests/test_topology.py::test_get_bigger_vp___vp2 PASSED [ 38%]
tests/test_topology.py::test_get_other_vp___vp1 PASSED [ 41%]
tests/test_topology.py::test_get_other_vp___vp2 PASSED [ 45%]
tests/test_topology.py::test_get_station PASSED [ 48%]
tests/test_topology.py::test_get_element_type PASSED [ 51%]
tests/test_topology.py::test_get_element_detail___G PASSED [ 54%]
tests/test_topology.py::test_get_element_detail___K PASSED [ 58%]
tests/test_topology.py::test_get_element_detail___W PASSED [ 61%]
tests/test_topology.py::test_get_neighbour_and_vp1 PASSED [ 64%]
tests/test_topology.py::test_get_verzweigung_vp___vp1 PASSED [ 67%]
tests/test_topology.py::test_get_verzweigung_vp___vp2 PASSED [ 70%]
tests/test_topology.py::test_set_drawn_y___vp1 PASSED [ 74%]
tests/test_topology.py::test_set_drawn_y___vp2 PASSED [ 77%]
tests/test_topology.py::test_set_connected___vp1 PASSED [ 80%]
tests/test_topology.py::test_set_drawn PASSED [ 83%]
tests/test_topology.py::test_are_all_elements_drawn PASSED [ 87%]
tests/test_topology.py::test_get_path_from_element_list___path1 PASSED [ 90%]
tests/test_topology.py::test_get_path_from_element_list___path2 PASSED [ 93%]
tests/test_topology.py::test_get_gleis_cp_map PASSED [ 96%]
tests/test_topology.py::test_test_get_cp_gleis_map PASSED [100%]

================================================= 31 passed in 0.51s ==================================================

(venv) C:\maxbox\bfh2023\bachmann_bfh.masthesis-1.0.0\bfh.masthesis-1.0.0>

SimTool
SimTool Next Level

Delphi Day at BFH

20th Delphi-User-Treffen (Delphi user meeting) of the Delphi Group Switzerland. Presenters: Max Kleiner, Christoph Schneider, et al., on 16/03/2024

CC 6574 Dole
LS Models CC 6510 Carcassonne
22 Permutations

Binary Clock Rock

In our clock we are going to represent each of the six digits used for hours, minutes and seconds as a binary number.  Since each number represents a single decimal digit, we will need 4 binary digits for each.  The maximum number we can represent with three binary digits (bits) is 111 which is 4+2+1=7 and we will also need to show bit representation of 8 and 9, thus the need for four.  

To match the hardware version of the clock, I represent the numbers vertically with the  top LED representing “8” .  So, using  “light off” for 0 and  “light on”  for 1, the clock above  represents 09:50:34.

Binary Values    Decimal value of row
0 1 0 0 0 0      "8"
0 0 1 0 0 1      "4"
0 0 0 0 1 0      "2"
0 1 1 0 1 0      "1"
H H M M S S
0 9 5 0 3 4      Decimal <-- time
Bin System Table
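
A few lines of Python reproduce the table above by masking each time digit with the row weights 8, 4, 2 and 1 (a sketch, not the Delphi clock code):

def clock_rows(hhmmss: str) -> None:
    digits = [int(c) for c in hhmmss]        # "095034" -> [0, 9, 5, 0, 3, 4]
    for weight in (8, 4, 2, 1):
        row = ''.join('1' if d & weight else '0' for d in digits)
        print(row, f'"{weight}"')

clock_rows('095034')
# 010000 "8"
# 001001 "4"
# 000010 "2"
# 011010 "1"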

The clock starts with the current system clock time and is updated every second.    You may left click anywhere on the clock to start and stop it.   Right mouse clicks will bring up a configuration panel with a number of options.  You can change the background, LED, and label  colors,  set 12 or 24 hr. time formats, hide or display labels, etc. 

{************** FormResize **************}
procedure TForm1FormResize(Sender: TObject);
{Recalculate clock display parameters when form size changes}
var rx,ry:integer;
begin
  cellwidth:=form1.clientwidth div 7;
  cellheight:=form1.clientheight div 5;
  offsetx:=cellwidth div 2;
  offsety:=cellheight div 2;
  rx:=form1.clientwidth div 50;
  ry:=form1.clientheight div 50;
  if rx>ry then radius:=ry else radius:=rx;
  form1.font.height:=cellheight div 2;
  LW:=form1.canvas.textWidth('H') div 2;
  LH:=form1.canvas.textheight('H') div 2;
end;

In this case, it is adequate just to call FormPaint from UpdateTime.  We will be waiting for the next timer pop most of the time anyway, so there is plenty of time for windows to update the screen.   Just for fun, I tried several other techniques and left them as comments in the code.   The only noticeable difference is a slight flicker when Invalidate is used.  I suppose because it erases the background before repainting the screen. 

Test Compare Script and EXE

There is some confusion over the best way to update a form's canvas. We can choose from calling the FormPaint procedure directly, Invalidate, Update, or Repaint.

http://delphiforfun.org/Programs/binary_clock.htm

procedure fmakebinary(val:integer; col:byte; var bin:TALed);
//array of array of boolean
 var n:integer;
  begin
    for n:= 3 downto 0 do begin
      bin[col][n]:= val mod 2 =1;
      val:= val div 2;
    end;
  end;