Blog

  • DzMiniTable

    DzMiniTable

    Delphi non-visual component to handle small dynamic table stored as plain text

    Delphi Supported Versions Platforms Auto Install VCL and FMX


Please check out my new component DzXMLTable, a new take on this component that stores data in XML format.

    What’s New

    • 09/12/2021 (Version 1.8)

      • Delphi 11 auto-install support.
    Click here to view the entire changelog
    • 03/13/2021 (Version 1.7)

      • Removed CompInstall.exe from component sources due to AV false positive warning (now you can get it directly from CompInstall repository).
    • 02/01/2021 (Version 1.6)

      • Removed Delphi XE2 from the list of environments as it was never possible to compile in this version.
    • 12/18/2020 (Version 1.5)

  • Updated Component Installer app (fixed call to rsvars.bat when Delphi is installed in a path containing space characters).
    • 10/31/2020 (Version 1.4)

      • Included Delphi 10.4 auto-install support.
    • 10/27/2020 (Version 1.3)

  • Fixed the package tag for previous Delphi versions (at least XE2, XE3, XE4 and XE5). It was causing a package compilation error.
    • 10/26/2020 (Version 1.2)

      • Updated CompInstall to version 2.0 (now supports GitHub auto-update)
    • 10/09/2020 (Version 1.1)

      • New methods to search data
    • 05/03/2020

      • Updated CompInstall to version 1.2
    • 02/11/2019

      • Include auto install app
    • 02/08/2019

  • Component renamed. Please fully uninstall the previous version before installing this one. ⚠️
    • 02/07/2019

      • Add Win64 support (library folders changed!) ⚠️

    Component Description

    When working on a software project, you often need to store some data in an INI file or a plain text file, as a configuration file or for other information.

    So your options are an INI file or plain text. And almost always you need a table with some fields.

    In a plain text file, you can use one record per line and separate fields with a tab, pipe, or other character. But this method has problems: you need to make sure the separator character never appears in field values; and, worse, if a future version needs an extra column, you lose compatibility with files that already contain data.

    If you are working with an INI file, you can name the fields, but even then it is awkward to store one record per section, and it is difficult to reorder, delete, and name records.

    But don’t worry, here is the solution.

    The MiniTable is a non-visual component that stores records with named fields and values, so you don’t need to worry about future versions. You can add new fields at any time, simply by reading and writing them.

    Installing

    Auto install

    1. Download Component Installer from: https://github.com/digao-dalpiaz/CompInstall/releases/latest
    2. Put CompInstall.exe into the component repository sources folder.
    3. Close Delphi IDE and run CompInstall.exe app.

    Manual install

    1. Open DzMiniTable package in Delphi.
    2. Ensure Win32 Platform and Release config are selected.
    3. Then Build and Install.
    4. If you want to use Win64 platform, select this platform and Build again.
    5. Add the sub-path Win32\Release to the Library paths at Tools\Options using the 32-bit option; if you have compiled for the 64-bit platform, also add the sub-path Win64\Release using the 64-bit option.

    Supports Delphi XE3..Delphi 11

    Published Properties

    AutoSave: Boolean = Enables auto-saving to the specified FileName whenever a method writes a change to the table

    FileName: String = Specifies the full file name to Open and Save the table

    JumpOpen: Boolean = When this property is enabled, if the file does not exist when Open is called, the table is loaded empty without raising an exception.

    Public Properties

    Lines: TStringList = Gives direct access to the underlying stored table. You should never change this TStringList manually.

    MemString: String = Allows you to load the table directly from a string, and store the table to a string. This is useful when you are storing the table in a database blob field.

    SelIndex: Integer = Returns the current selected index (read-only property)

    Count: Integer = Returns the record count of the table (read-only property)

    F[FieldName: String]: Variant = Reads/writes a field value at the current selected record. The FieldName is case-insensitive. If you read a field that does not exist, the result is an empty string.
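For example, a record update via F might look like this (the field names are invented for illustration, and Table stands for a TDzMiniTable instance):

```pascal
// Hypothetical usage of the F property; field names are made up.
Table.Select(0);                    // select the first record
OldPhone := Table.F['Phone'];       // '' if the field does not exist
Table.F['Phone'] := '9999-0000';    // write the new value
Table.Post;                         // write the change back to the table
```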

    Procedures/Functions

    procedure SelReset;

    Resets the record selection to none. You can use this method to initialize an iteration over records, ensuring the selected record is reset.

    function InRecord: Boolean;

    Returns true if there is a record selected

    procedure Open;

    Load the table from the file specified by the FileName property

    procedure Save;

    Save the table to the file specified by the FileName property

    procedure EmptyTable;

    Clear all data in the table

    procedure EmptyRecord;

    Clear all data in the current selected record

    function IsEmpty: Boolean;

    Returns true if the table is empty

    procedure Select(Index: Integer);

    Select the record by index position. When you select a record, all its fields are stored in internal memory, so you can read and write field values using the F property.

    procedure First;

    Select the first record in the table

    procedure Last;

    Select the last record in the table

    function Next: Boolean;

    Select the next record in the table, based on the current index position. This method is useful for iterating over all records. See the example below:

    DzMiniTable.SelReset;
    while DzMiniTable.Next do
    begin
      ListBox.Items.Add(DzMiniTable.F['Name']+' / '+DzMiniTable.F['Phone']);
    end;
    procedure New;

    Create a new record at the end of the table and select it, so you can immediately start writing fields.

    procedure Insert(Index: Integer);

    Insert a new record at the index position and select it, so you can immediately start writing fields.

    procedure Post;

    Writes all changes in the current record to the table. You don’t need to explicitly start editing the record. See the example below:

    DzMiniTable.New;
    DzMiniTable.F['Name'] := 'John';
    DzMiniTable.F['Phone'] := '1111-2222';
    DzMiniTable.Post;

    or:

    DzMiniTable.Select(3);
    DzMiniTable.F['Phone'] := '1111-2222';
    DzMiniTable.Post;
    procedure Delete;

    Delete the current selected record

    procedure MoveDown;

    Move the current record one index down

    procedure MoveUp;

    Move the current record one index up

    function FindIndex(const FieldName: string; const Value: Variant): Integer;

    Find any field value on all records, returning the record index position.

    function Locate(const FieldName: string; const Value: Variant): Boolean;

    Find any field value on all records, returning true if a record is found and positioning it as the current record. If no record is found, the current position will not be changed.

    function ContainsValue(const FieldName: string; const Value: Variant): Boolean;

    Find any field value on all records, returning true if a record is found.

    function FieldExists(const FieldName: String): Boolean;

    Returns true if FieldName exists in the current selected record.

    function ReadDef(const FieldName: String; const Default: Variant): Variant;

    This function is the same as the F property, but here you can specify a default value for when the field does not exist in the record.
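A short sketch of the difference from F (the 'Timeout' field name and the default are invented for illustration):

```pascal
// F would return '' for a missing field; ReadDef returns your default.
Timeout := Table.ReadDef('Timeout', 30); // 30 when no 'Timeout' field exists
```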

    Visit original content creator repository https://github.com/digao-dalpiaz/DzMiniTable
  • hass-sidecar


    Home assistant sidecar

    Description

    This is an app written in Typescript and NodeJS to interact with your Home Assistant installation. It manages the websocket and the MQTT connections with your host(s), and you can write your own automations in typescript in a simple way.

    All automations are hotloaded, so you can create, modify and delete automations and they will be unloaded and loaded on the fly, without restarting the app.

    Why

    Home Assistant is a great platform to manage our smart devices, but for me the automation system is not powerful enough, and on many occasions I need to do more complex things than I can with simple YAML or the web interface. I love TypeScript and NodeJS, and it is faster for me to write automations in this environment.

    Getting started

    For detailed information about classes and methods, go to docs/globals.md

    Installation

    > NodeJS version

    # Clone the repo
    git clone git@github.com:d4nicoder/hass-sidecar
    # Change directory
    cd hass-sidecar
    # Install dependencies
    npm ci

    > Docker run version

    docker run \
      -e HA_HOST=<your-host> \
      -e HA_TOKEN=<token> \
      -e MQTT_URI=<mqtt-uri> \
      -v <path/to/your/automations/folder>:/opt/app/src/automations \
      --restart=unless-stopped \
      danitetus/hass-sidecar:latest

    > Docker compose version

    version: '3'
    services:
      hass-sidecar:
        image: danitetus/hass-sidecar:latest
        environment:
          HA_HOST: <home-assistant-host>
          HA_TOKEN: <home-assistant-token>
          MQTT_URI: <mqtt-uri>
        volumes:
          - <path/to/your/automations/folder>:/opt/app/src/automations
        restart: unless-stopped

    NodeJS Dependencies

    You can install dependencies directly with npm (on native installations) or, in the Docker version, by setting an environment variable called DEPENDENCIES.

    NodeJS

    npm install dependency1 dependency2 ...

    Docker

    docker run \
      -e HA_HOST=<your-host> \
      -e HA_TOKEN=<token> \
      -e MQTT_URI=<mqtt-uri> \
      -e DEPENDENCIES="dependency1 dependency2 ..." \
      -v <path/to/your/automations/folder>:/opt/app/src/automations \
      --restart=unless-stopped \
      danitetus/hass-sidecar:latest


    Setup

    The best way to set up is to create a .env file in the project's root folder. You have to set these variables:

    HA_HOST: http://your-host-or-ip:8123
    HA_TOKEN: <token-provided-from-home-assistant>
    MQTT_URI: mqtt://user:pass@server:port

    Start

    npm start

    Creating automations

    All automations have to be stored in ./src/automations (you can organize them in subfolders). They should extend the Automation class.

    Let’s create an automation example. We are going to turn a light on and off when an occupancy sensor changes:

    /*
      ./src/automations/presenceLight.ts
    */
    
    import { Automation } from '../interfaces/Automation'
    
    module.exports = class MyAutomation extends Automation {
      private lightEntity = '<entity_id>'
      private sensorEntity = '<entity_id>'
    
      constructor() {
        super('Title of my automation', 'Description') // Title and description are optional
    
        this.onStateChange(this.sensorEntity, (newState, oldState) => {
          if (newState.state === 'on') {
            this.callService('light', 'turn_on', this.lightEntity, {
              // Attributes are optional
              transition: 3
            })
          } else if (newState.state === 'off') {
            this.callService('light', 'turn_off', this.lightEntity, {
              // Attributes are optional
              transition: 3
            })
          }
        })
      }
    }

    Timeouts, intervals and runAt

    To create tasks on intervals or timeouts, or simply execute a function at a certain time, you must use the methods of the Automation class dedicated to this. Do not use the NodeJS functions (setInterval, setTimeout, clearInterval, clearTimeout), because if your class has to be reloaded (due to a modification or because it has been removed), those callbacks cannot be removed, which will cause them to keep executing anyway or make the application fail.

    Let’s see an example of this:

    /*
      ./src/automations/crazyLights.ts
    */
    
    /**
     * Let's create an automation to do:
     *  - toggle a light every 10 seconds
     *  - say something in one speaker after 10 minutes
     *  - clear the interval when 60 seconds have passed
    */
    
    import { Automation } from '../interfaces/Automation'
    import moment from 'moment'
    
    module.exports = class CrazyLights extends Automation {
      
      // Define our private properties
      private lightEntity = '<entity_id>'
      private speakerEntity = '<entity_id>'
      private intervalId: NodeJS.Timeout
    
      /**
       * Instantiate
       */
      constructor () {
        // Good practice to define title and description (for the logs)
        super('Crazy lights', 'Lights going crazy')
    
        // Define toggle action on 10 seconds interval
        this.intervalId = this.setInterval(() => {
          this.callService('light', 'toggle', this.lightEntity)
        }, 10000)
    
        // Let's delete the interval after 60 seconds
        this.setTimeout(() => {
          this.clearInterval(this.intervalId)
        }, 60000)
    
        // After 10 minutes, say something on the speaker
        this.runAt(moment().add(10, 'minutes').toDate(), () => {
          this.callService('tts', 'google_cloud_say', this.speakerEntity, {
            message: 'This light is crazy'
          })
        })
      }
    }

    Create your own libraries

    You can create your own libraries to use in your automations. They have to be placed inside a “lib” folder. This is mandatory, because lib folders are skipped when loading automations.

    Let’s see an example:

    /*
      ./src/automations/lib/sayHello.ts
    */
    
    export const sayHello = (name: string): string => {
      return `Hi ${name}!`
    }

    /*
      ./src/automations/sayHi.ts
    */
    
    import { Automation } from '../interfaces/Automation'
    import { sayHello } from './lib/sayHello'
    
    module.exports = class SayHi extends Automation {
      constructor() {
        super('Say hello', 'Just for education')
    
        sayHello('Daniel')
      }
    }

    About logging

    For logging, use the class provided for it. Why? Because it lets you trace the time of every log and colors each line based on its type (debug, info, error, log, warning…). Of course you can still use the typical console.[log|error|info…], but it will be horrible for your eyes xD.
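As a rough illustration of the idea, here is a minimal self-contained sketch of a timestamped, per-level colored logger (the actual class shipped with hass-sidecar may expose a different API):

```typescript
// Minimal sketch of a timestamped, level-colored logger (hypothetical:
// the logger class in this repo may look different).
type Level = 'debug' | 'info' | 'warn' | 'error'

const colors: Record<Level, string> = {
  debug: '\x1b[90m', // grey
  info: '\x1b[32m',  // green
  warn: '\x1b[33m',  // yellow
  error: '\x1b[31m', // red
}

function log(level: Level, message: string): string {
  // Prefix every line with an ISO timestamp and the level, then colorize it
  const line = `${colors[level]}[${new Date().toISOString()}] [${level.toUpperCase()}] ${message}\x1b[0m`
  console.log(line)
  return line
}

log('info', 'automation loaded')
log('error', 'something went wrong')
```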

    Visit original content creator repository
    https://github.com/d4nicoder/hass-sidecar

  • kinvey-starter-ionic2

    kinvey-starter-ionic2-RC – Updated to work with latest release candidate

    This project is being migrated to manage state using ngrx/store & ngrx/effects

    Tested With Ionic Version

    Cordova CLI: 6.3.0
    Ionic Framework Version: 2.0.0-rc.4
    Ionic CLI Version: 2.1.17
    Ionic App Lib Version: 2.1.7
    Ionic App Scripts Version: 0.0.47
    ios-deploy version: 1.8.6
    ios-sim version: 5.0.6
    OS: macOS Sierra
    Node Version: v5.0.0
    Xcode version: Xcode 8.2.1 Build version 8C1002
    

    Check out the repo and run npm install to download the required node modules for the project

    Required Plugins

    • Camera
    • File

    Create Account w/ Kinvey

    • We are using the Javascript API in this project, not the Angular module
    • When I upgraded to Ionic2 and the Kinvey JS API, I left the original REST API code in the project, in case you are looking for that solution as well

    Setup

    Edit the configuration file src/providers/config.ts to contain the proper credentials from your Kinvey account

    export let KINVEY_BASE_URL = "https://baas.kinvey.com/";
    export let appKey = "YOUR-APP-KEY-GOES-HERE"
    export let secretKey = "YOUR-SECRET-KEY-GOES-HERE"
    export let KINVEY_AUTH = btoa(appKey + ':' + secretKey)
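btoa is a browser API; if you ever need the same KINVEY_AUTH value in a Node context, the equivalent is Buffer-based base64 encoding (a sketch, using the same placeholder keys):

```javascript
// Node equivalent of the btoa(...) line above (btoa is browser-only).
// The key values are placeholders, exactly as in the config file.
const appKey = "YOUR-APP-KEY-GOES-HERE";
const secretKey = "YOUR-SECRET-KEY-GOES-HERE";

const KINVEY_AUTH = Buffer.from(appKey + ":" + secretKey).toString("base64");

// KINVEY_AUTH is used as a Basic auth header value by the REST-style code:
const authHeader = "Basic " + KINVEY_AUTH;
console.log(authHeader);
```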

    Make sure to create a collection named todo-collection in your Kinvey console, with objects structured like this:

    {
      description: "Sample To Do Description",
      title: "Sample To Do",
      status: "open"
    }

    The application demonstrates the following:

    • Login and Logout
    • Create New User Account
    • View Data Collection
    • Delete Object From Data Collection – swipe list item to expose delete button (In Progress)
    • Add Object to Data Collection
    • Using Camera Plugin to capture image and then upload file
    • Add File Objects as A Collection

    Ionic 1 Version of Starter Project Can Be Found Here

    MORE IONIC2 SAMPLES HERE

    Visit original content creator repository
    https://github.com/aaronksaunders/kinvey-starter-ionic2

  • dp2-lab3

    Distributed Programming II A.Y. 2016-17

    Assignment n. 3 – Part B

    This archive includes:

    • README This file
    • Assignment3a.pdf The text of Assignment 3 – Part a
    • Assignment3b.pdf The text of Assignment 3 – Part b
    • build.xml The ant script for this assignment
    • tomcat-build.xml The ant script for tomcat-related targets (included by build.xml)
    • custom The location of custom files
    • doc The folder where you have to put the documentation of your design.
    • lib The location of the jar files necessary for this assignment
    • lib-src The location of library sources (to be attached to the
      corresponding lib jar files in eclipse)
    • src The location of source files
    • WebContent The folder including the files used for the deployment to Tomcat
    • war The location of the .war file of the Neo4JXML service and of the
      .war file that will be generated
    • xsd The location of your schema files

    Setting the work environment for the project

    After Tomcat installation, set the Tomcat-related properties in tomcat-build.xml.

    You can create a single eclipse java project for this package and then add the
    jars under lib to the build path. It is also suggested to attach the sources
    available in lib-src to the corresponding jar library.
    DO NOT USE tomcat-build.xml directly. Instead use the build.xml (which imports it).

    Start tomcat by running the start-tomcat target
    (from build.xml).
    The other instructions for using build.xml are included in the assignment text.

    Visit original content creator repository
    https://github.com/riccardopersiani/dp2-lab3

  • Satisfactory-Dedicated-Server-ARM64-Docker

    Satisfactory Dedicated Server for ARM64 (Docker Container)

    This Docker container provides a dedicated server for running Satisfactory on ARM64 architecture. It is based on nitrog0d/palworld-arm64.


    !!! IMPORTANT (Updated for 1.1)!!!

    The server now appears to be running without crashing! The previous conveyor belt–related crash no longer occurs.
    That said, further testing is needed to fully confirm overall stability.


    Getting Started

    1. Download or Clone Repository:
      Download or clone this repository to your desired folder, for example, satisfactory-server.

    2. Set Up Permissions:
      Create two folders named satisfactory and config (your savegame and server config will be stored in there) and grant full permissions to them:

      • Using chmod:

        sudo chmod 777 satisfactory
        sudo chmod 777 config
        
      • Using chown (replace USER_ID:GROUP_ID with the desired user’s IDs, for example, 1000:1000):

        sudo chown -R USER_ID:GROUP_ID satisfactory
        sudo chown -R USER_ID:GROUP_ID config
        

        (On Oracle Cloud Infrastructure (OCI), by default, the user with the ID 1000:1000 is opc. However, since this user is primarily intended for the setup process, it is advisable to utilize the ubuntu user with IDs 1001:1001)

    3. Build the Docker Image:
      Run the build script:

      sh build.sh
      

      If execution permission is denied, grant it:

      chmod +x build.sh
      
    4. Run the Docker Image:
      After the build process completes, start the Docker image either by running:

      sh run.sh
      

      Or via Docker Compose in detached mode:

      sudo docker compose up -d
      
    5. Open Necessary Ports:
      The following ports must be opened for the server to function properly:

      • TCP: 7777, 8888
      • UDP: 7777
        Ensure these ports are open using the Linux firewall of your choice and also within the Security List of the Oracle Cloud Infrastructure Network.
    6. Default Port:
      The default port for the server is 7777.

    Now your Satisfactory Dedicated Server for ARM64 is ready! Enjoy your gaming experience with friends.

    Modifying Server Port Configuration

    To alter the server port, you’ll need to make adjustments in the docker-compose.yml file:

    1. docker-compose.yml:
      Edit this file to expose the desired ports outside of the container and set the $EXTRA_PARAMS environment variable to configure additional parameters for the FactoryServer.sh script.

    Ensure that these changes are made accurately to reflect your desired server port configuration.

    $EXTRA_PARAMS Options

    | Option | Description | Example |
    | --- | --- | --- |
    | -multihome= | Bind the server process to a specific IP address rather than all available interfaces | -multihome=192.168.1.4 |
    | -ServerQueryPort= | Override the Query Port the server uses. This is the port specified in the Server Manager in the client UI to establish a server connection. This can be set freely. The default port is UDP/15777. | -ServerQueryPort=15000 |
    | -BeaconPort= | Override the Beacon Port the server uses. As of Update 6, this port can be set freely. The default port is UDP/15000. If this port is already in use, the server will step up to the next port until an available one is found. | -BeaconPort=15001 |
    | -Port= | Override the Game Port the server uses. This is the primary port used to communicate game telemetry with the client. The default port is UDP/7777. If it is already in use, the server will step up to the next port until an available one is found. | -Port=15002 |
    | -DisablePacketRouting | Startup argument for disabling the packet router (automatically disabled with multihome) | -DisablePacketRouting |

    Example usage:

    EXTRA_PARAMS=-ServerQueryPort=17531 -BeaconPort=17532 -Port=17533
    

    Auto Update

    If you want to check for game server updates, add the following to docker-compose.yml:

    environment:
        - ALWAYS_UPDATE_ON_START=true
    

    Visit original content creator repository
    https://github.com/sa-shiro/Satisfactory-Dedicated-Server-ARM64-Docker

  • LinkedIn-SERP-scraper

    LinkedIn SERP Scraper

    LinkedIn SERP scraper fetches results of LinkedIn public profiles from the Bing search engine results page and returns results in JSON format

    Setup

    1. Clone this project
    2. Install dependencies mentioned inside requirements.txt

    How to use?

    In the project directory:

    To scrape a single LinkedIn profile:

    $ python3 script.py --linkedin_url < url > --use_browser < browser_name >

    Argument:

  • linkedin_url is the LinkedIn profile URL.
    For example, if the user wants to scrape the URL https://in.linkedin.com/in/koushik-majumder-6172b412a,
    the command line will look like
    python3 script.py --linkedin_url https://in.linkedin.com/in/koushik-majumder-6172b412a
  • --use_browser is an optional argument; if not passed, Firefox will be used by default. The other available option is Google Chrome.

  • To scrape multiple LinkedIn profiles:

    $ python3 script.py --filename < filepath >

    Argument:

  • filename is the path of the CSV file which contains all profile URLs separated by newlines. For example, if file.csv is located at /usr/documents/file.csv,
    then the command line will look like
    python3 script.py --filename /usr/documents/file.csv --use_browser firefox
  • --use_browser is an optional argument; if not passed, Firefox will be used by default. The other available option is Google Chrome.

    The above command will use Firefox and scrape all LinkedIn profiles inside file.csv


    Help:

    $ python3 script.py -h


    Privacy

    This scraper only scrapes public data available to an unauthenticated user and does not have the capability to scrape anything private.

    LICENSE

    MIT

    Visit original content creator repository
    https://github.com/shaikhsajid1111/LinkedIn-SERP-scraper

  • cupy

    Visit original content creator repository
    https://github.com/kmaehashi/cupy

  • rmate-nim

    rmate

    EXPERIMENTAL

    This is a port of my rmate shell script to Nim.
    Current state: it seems to work, but be careful: I am just learning Nim 🙂

    Description

    TextMate 2 adds a nice feature that makes it possible to edit files on a remote server
    using a helper script. The tool needs to be copied to the server you want to remote-edit
    files on. After that, open your TM2 preferences, enable the “Allow rmate connections”
    setting in the “Terminal” settings, and adjust the “Access for” setting according to your
    needs:

    Local client connection

    It’s a good idea to allow access only for local clients. In this case you need to open
    an SSH connection to the system you want to edit a file on and additionally specify a
    remote tunnel:

    ssh -R 52698:localhost:52698 user@example.com
    

    If you are logged in on the remote system, you can now just execute

    rmate test.txt
    

    Please have a look at the sections “Remote client connection” or “SSL secured client
    connection” if ssh is not available or in environments where remote port forwarding
    could result in conflicts for example with concurrent users.

    Remote client connection

    On some machines, where port forwarding is not possible, you can allow access for
    “remote clients”. Just ssh or telnet to the remote machine and execute:

    rmate -H textmate-host test.txt
    

    To secure your TextMate, rmate supports SSL secured connections and client certificate
    authentication. See the section “SSL secured client connection” below for details.

    SSL secured client connection

    This version of rmate supports SSL secured connections and client certificate
    authentication. For this to work it’s recommended to configure TextMate to
    allow connections from local clients only.

    Next you must install a proxy supporting SSL -> non-SSL connections like stunnel
    or haproxy on your Mac. Details regarding this would be too much for this documentation.
    For haproxy there is an example configuration available.

    Have a look at this
    excellent tutorial for details on how to create self-signed SSL server certificates and
    certificates for client side certificate authentication.

    Create a PEM file of the client certificate by merging the client certificate and the
    client certificate key file, for example:

     cat ca.crt ca.key > ca.pem
    

    Copy the resulting client certificate file over to the machine you have installed rmate
    on and you want to edit files from.

    To enable SSL encrypted connections, the --ssl flag needs to be specified as an argument
    for the rmate command. Additionally, the --cert ... flag needs to be specified if
    client side certificate authentication must be used. To verify the SSL server certificate
    on the rmate side, you can additionally specify the --verify flag. This flag should be omitted
    when using self-signed certificates.

    Optionally the flags can be configured in the rmate configuration file, similar to host
    and port settings:

    ssl=yes
    ssl_cert=file
    ssl_verify=yes
    

    Note that the ssl_verify setting should be omitted when using self-signed certificates.

    Example

    Example session: Editing html file located on an SGI o2: https://github.com/aurora/rmate/wiki/Screens

    Usage

    Edit specified file

    $ ./rmate [arguments] file-path
    

    Read text from stdin

    $ echo "hello TextMate" | ./rmate [arguments] -
    

    Arguments

    -H, --host HOST  Connect to HOST. Use 'auto' to detect the host from
                     SSH. Defaults to $#.
    -p, --port PORT  Port number to use for connection. Defaults to $#.
        --ssl        Use SSL encrypted connection.
        --cert FILE  Certificate file (PEM format) for client side certificate
                     authentication.
        --verify     Verify peer for SSL connection.
    -w, --[no-]wait  Wait for file to be closed by TextMate.
    -l, --line LINE  Place caret on line number after loading file.
    -m, --name NAME  The display name shown in TextMate.
    -t, --type TYPE  Treat file as having specified type.
    -n, --new        Open in a new window (Sublime Text).
    -f, --force      Open even if file is not writable.
    -v, --verbose    Verbose logging messages.
    -h, --help       Display this usage information.
        --version    Show version and exit.
    

    Default parameter configuration

    Some default parameters (host and port) can be configured by defining them
    as the environment variables RMATE_HOST and RMATE_PORT or by putting them
    in a configuration file. The configuration files loaded are /etc/rmate.rc
    and ~/.rmate.rc, e.g.:

    host: auto  # prefer host from SSH_CONNECTION over localhost
    port: 52698
    

    Alternative notation for configuration file is:

    host=auto
    port=52698
    

    The precedence for setting the configuration is (higher precedence counts):

    1. default (localhost, 52698)
    2. /etc/rmate.rc
    3. ~/.rmate/rmate.rc
    4. ~/.rmate.rc
    5. environment variables (RMATE_HOST, RMATE_PORT)

    Disclaimer

    Use with caution. This software may contain serious bugs. I can not be made responsible for
    any damage the software may cause to your system or files.

    License

    rmate

    Copyright (C) 2015-2017 by Harald Lapp harald@octris.org

    This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

    This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.

    You should have received a copy of the GNU General Public License along with this program. If not, see http://www.gnu.org/licenses/.

    Visit original content creator repository
    https://github.com/aurora/rmate-nim

  • whatsnew

    whatsnew

    Check for new github releases of your Golang application 🎊



    whatsnew provides a simple way to check GitHub for new releases of your Go application. It saves results between runs, uses etags to speed up responses, and tries to minimize the overhead it adds to an otherwise fast application CLI run.
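The etag behaviour mentioned above is ordinary HTTP conditional requesting; a generic sketch of the pattern follows (illustrative Go, not whatsnew's internal code, and the GitHub URL is a placeholder):

```go
package main

import (
	"fmt"
	"net/http"
)

// buildConditionalRequest sketches the conditional-request pattern: the
// previously cached ETag goes in If-None-Match, so the server can answer
// 304 Not Modified with no body when nothing has changed.
func buildConditionalRequest(url, cachedETag string) (*http.Request, error) {
	req, err := http.NewRequest(http.MethodGet, url, nil)
	if err != nil {
		return nil, err
	}
	if cachedETag != "" {
		req.Header.Set("If-None-Match", cachedETag)
	}
	return req, nil
}

func main() {
	req, err := buildConditionalRequest(
		"https://api.github.com/repos/you/your-app/releases/latest",
		`W/"cached-etag"`,
	)
	if err != nil {
		panic(err)
	}
	// A 304 reply to this request means: reuse the cached release info.
	fmt.Println(req.Header.Get("If-None-Match"))
}
```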

    If caching to disk or reading from GitHub don’t work for you, you can customize the behaviour.

    Quick start

    import (
      "context"
      "github.com/jbowes/whatsnew"
    )
    
    func main() {
    	ctx := context.Background()
    
    	// Start a whatsnew Check
    	fut := whatsnew.Check(ctx, &whatsnew.Options{
    		Slug:    "you/your-app",
    		Cache:   "testdata/update-cache.json",
    		Version: "v0.0.1",
    	})
    
    	// Run your CLI code and whatnot
    
    	// Wait for the Check to complete, and show the results
    	if v, _ := fut.Get(); v != "" {
    		fmt.Printf("new release available: %s\n", v)
    	}
    }

    For more usage and examples, see the GoDoc Reference

    Alternatives

    whatsnew only checks for releases. If you’re looking for a package that will let your application update itself, or you prefer packages that start with go-, consider one of these:

    Contributing

    I would love your help!

    whatsnew is still a work in progress. You can help by:

    • Opening a pull request to resolve an open issue.
    • Adding a feature or enhancement of your own! If it might be big, please open an issue first so we can discuss it.
    • Improving this README or adding other documentation to whatsnew.
    • Letting me know if you’re using whatsnew.
    Visit original content creator repository https://github.com/jbowes/whatsnew
  • trkr

    trkr

    trkr is a command-line tool to help track what our team is working on and
    how long they work on it. With Trello and Google Sheets integration, it allows
    you to search and select a related Trello card, and writes directly to a Worksheet.

    Requirements

    Python 2.6+ or 3+

    Installation

    pip install trkr

    Usage

    Using trkr is as simple as running trkr run. A list of valid commands is available
    by running trkr --help.

    Description

    Provide a description of the work accomplished. If none is provided, the commit
    message of the last commit at HEAD will be used.

    Minutes Worked

    Time worked in minutes (must be a valid integer).

    Trello Card

    You have the choice to (i)nput a card’s URL, fuzzy (s)earch for a card, (p)ick from
    a list of your assigned cards, or choose to (n)ot include a card.

    Date

    Enter a date in the MM/DD/YYYY format. This can be skipped, in which case the
    current date is used instead.

    Setup

    To start the setup script, run trkr setup. It will ask for an email, Trello
    API keys, and the worksheet URL. All settings are saved at ~/.trkr/config.json,
    and can be modified at a later time.

    Trello Authentication

    Finding Client ID and Board ID

    On Trello, navigate to a board and append .json to the URL. It should look
    something like https://trello.com/c/<url>.json. When the JSON data has loaded,
    the first id will be the Board ID; copy and save it somewhere.

    Next, search for your name or username, find the id associated with it, and
    save it somewhere; this is your Client ID.
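The id lookup can also be scripted. A rough Python sketch of parsing the board JSON (the structure, field names, and ids below are invented to match the description above, so adjust the member matching to your real export):

```python
import json

# Illustrative sample of the board JSON shape described above; real Trello
# exports contain many more fields, and these ids are made up.
board_json = """
{
  "id": "5d9board000000000000001",
  "members": [
    {"id": "5d9client0000000000001", "username": "your-username"}
  ]
}
"""

data = json.loads(board_json)
board_id = data["id"]  # the first "id" in the document is the Board ID
client_id = next(
    m["id"] for m in data["members"] if m["username"] == "your-username"
)
print(board_id, client_id)
```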

    API Keys and Token

    Trello API keys can be found at trello.com/app-key.
    The hash found under Key is your API Key, and the one under Secret is your
    API Secret. A Token can be generated by clicking Token on the same page; this
    is your Token.

    Google Sheets Authentication

    Once you’ve created a new Google Sheet, its URL is the Document URL, and the
    name of the sheet at the bottom is the Worksheet Name.

    In order to authorize trkr to read/write from a worksheet, follow the steps
    laid out in Authorizing pygsheets.
    With the JSON file in hand, move it to ~/.trkr/keyfile.json. Lastly, share
    the worksheet with the email found in keyfile.json. trkr should now be set up
    and ready to use!

    Acknowledgement

    trkr relies on the great work done by the pygsheets
    and py-trello teams.

    Visit original content creator repository
    https://github.com/D3MNetworks/trkr