
Need Basic Instructions to Sparky Rescue 4.1

Started by oldefoxx, May 12, 2016, 05:45:07 PM


oldefoxx

A little quick background so that you know where I am coming from and that this is no joke:
Retired Systems Engineer/Analyst.  Trained on 7 military computer systems, 3 digital switching systems, networking, workstations, routers, bridges, and several programming languages.  Self-taught on PCs, Assembler, DOS, Windows, several BASICs, a bit of C/C++, databasing, analysis of network traffic, and more.

Personally dropped Windows in 1998 as too unfit for real use, too vulnerable to attacks, and a rip-off in that you bought static code, and all real enhancements were targeted towards a new future release that you would have to buy to replace what you already had that was working.  Faked planned obsolescence in file formats, lack of new features, and a deliberate drop in bug fixes and patches for existing code to force people and businesses to upgrade was reminiscent of the 50s and 60s, when Detroit used similar tactics to force people to replace their cars every 3 to 5 years, but there they used real obsolescence by inferior product design as well, so that parts would not last.

Windows was never secure and never meant to be.  It was overly documented as a way to persuade programmers and 3rd party developers to write code specific to Windows, and by adding more features to each forthcoming version of Windows, Microsoft was able to influence these people to do the same: to write code that required the buyers to buy upgrades of their products as well.

All that documentation and openness gave hackers all they needed to learn how to get into these systems and do their worst.  So you get free protection software or subscribe to yearly contracts to get weekly signature updates.  By mixing and matching such products at work, I could achieve perhaps 98% detection of the known malware at the time.  Not all of it could be eliminated, meaning the systems had to be purged and reinstalled.  Most hackers are not personally this knowledgeable, but they create and share hacker kits among themselves.  This is where signatures come into play, as the kits tend to be of a type with the code they generate.

But all this effort just to get to possibly 98% effectiveness, and that was just a guess, and all of it after the fact, and for it to be really effective you had to have some of it permanently in RAM taking up processor time, plus engage in full disk scans, not just recent file changes, because new signatures might identify hidden malware that was already embedded in your system.  All this meant bad news all around.  So I went over to Linux on my own.  And that solved all problems for over a decade.

But no longer.  I began with Knoppix, then decided to try Mandrake, then was persuaded by an article to give Ubuntu a shot, so moved to Ubuntu 8.04.  That was a good place to be, and I followed Ubuntu to 10.04.  But a brain injury in 2008 put a brake on PC use for a few years as my body attempted to adjust and recover; I remain on several maintenance drugs and my sense of balance has never returned, so I get around with the aid of a walker.

Finally motivated to get back into PCs to fill in my time, and to learn the truth about Obama by getting on the Internet, I got the keyboard in my lap, and after a few weeks got enough mobility back in my hands and fingers to do a bit of typing.  I decided to check first for the latest in Ubuntu.  That was 14.04 LTS.  I worked out how I should be able to keep my user account or accounts and just replace the system folders and files, and found that under the Install options I could use "Something else" and keep just the non-system folders and contents intact.  I wrote my process up as a thread, and was given a single command for eliminating all folders and files other than /m*/*/*/home.

This command assumes that you've used the LiveCD "Try" option and only mounted the partition to be overwritten in the Install step.  You enter a terminal and type "sudo rm -r /m*/*/*/home/../[!h]*".  It's a dangerous command, so it must be entered exactly.

Expanding on some of the wildcards makes it more specific and less risky, but as it stands, it allows for any distro that mounts drives under /mnt or /media, any user name (normally a shorthand version of the distro name), and any mounted device as long as it has a "home" folder at that level, from which it retreats and deletes everything except folders and files that begin with "h", meaning leave home itself.  Instead of the 2nd "*" in this command you can use $USER.

If more than one device is mounted, you can put enough of the sought device's mount name to make it unique (with one or more "*" included to fill in for the rest) where the second "/*/" appears above.
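
To make that pattern concrete, here is a rough sketch of the more explicit form I mean, assuming the live session user is the default "ubuntu" and the target partition got mounted under the label "MYDISK" (both are placeholders, so adjust them to whatever your live session actually used):

    # Dry run first: list what the glob will match.  The home/.. piece anchors
    # the match to a partition that really has a home folder, then steps back
    # up to its root; [!h]* then matches everything not beginning with "h".
    ls -d /media/$USER/MYDISK/home/../[!h]*

    # Only when that listing looks right, delete.  This wipes every top-level
    # item on that partition except home, so triple-check the mount point.
    sudo rm -r /media/$USER/MYDISK/home/../[!h]*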

Some might think this step is unnecessary, and it may be, unless you are attempting a reinstall.  Unfortunately the Installer may assume you want an upgrade of what is already there, but may mishandle it and cause problems.  The order in which upgrades originally happened matters, but the Installer will attempt to apply them in the order the repositories are queried, and that can prove to be critically wrong.  You can't stop this effort, so you wipe out everything but /home and begin from scratch with manual additions.

The problem is, Ubuntu began to go wrong in terms of partition and drive management.  I could induce failures with specific acts at certain points, which I reported as best I could.  No action was taken on these reports.  Trying to post warning threads at ubuntuforums.org and askubuntu.org got me criticized and essentially banned from both sites.  Nobody cared enough to follow up on my reports.  I stayed away from what I knew would cause the failures, and hoped I would not trip over another cause.  When the support for 14.04 LTS began to drop away as it neared end of life (this coming August), I decided to move to Ubuntu 15.10.

Almost immediately 15.10 showed partition and drive corruption issues.  Only now they happened on the fly, and I had no idea what was causing it.  I spent most of my time in data rescue, stripping the drives, and minimizing software installs to see if I could isolate the cause.  That did not work, so I dropped back to the LiveCD for 14.04.  But run from the LiveCD long enough and do much with the partitions, and they would corrupt again.  So I started looking for rescue and restore disks that were Linux-centric.  I had no assurance that any that were Windows-based were really up to the task of dealing with Ext4 partitions.  There were few.

Finally, in desperation, knowing that the problem was in Ext4 itself, I resolved to go back to Ext3.  But there is no separate Ext3 code any more, as the Ext4 driver does it all for Ext2, Ext3, and Ext4.  As soon as Ext4 formatted my first Ext3 partition, it went into failure mode, with fsck reporting endless errors in the partition.  It ran 3 days before I shut it down, looping on the same errors and occasionally reporting new ones.  I could not take a snapshot or stop it, and the keys did nothing.  I finally just killed power.  So anything Ext? was out.  What were my other options, and could I even trust 14.04 any more in other regards?
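
For anyone who wants to check this for themselves, the sequence I mean is roughly the following; /dev/sdb1 is only a stand-in for whichever unmounted partition is involved:

    # Create an Ext3 filesystem (mke2fs does the actual work; the Ext4 kernel
    # driver is what mounts Ext2/Ext3/Ext4 these days).
    sudo mkfs.ext3 /dev/sdb1

    # Force a full check of the unmounted partition.  This is the step that
    # looped on errors for me.
    sudo fsck.ext3 -f /dev/sdb1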

I elected to try other Ubuntu distros, meaning Mint, Mate, and Ubuntu-Gnome.  Mint and Mate were hard to nail down on doing things from the desktop, but Ubuntu-Gnome was on 16.04 and easier to follow.  Would this be enough?  It had a few things that were buggy at the start, but showed promise, until I tried the basic stuff.  It had only three entries in /etc/apt/sources.list, so you could install or source very little, and could not get gpart or any other recovery tool of note.  It offered the three Ext? types, plus FAT32, NTFS, XFS, JFS, and a few not common.  However, its lineup of supported types does not agree with tools like gparted.  For commonality, you are reduced to 7 choices, including the Windows ones.
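
For comparison, this is roughly what a fuller sources.list looks like on a 16.04 (xenial) system, going by the standard archive layout as best I recall it; check the exact mirror and components for yourself before relying on it:

    deb http://archive.ubuntu.com/ubuntu xenial main restricted universe multiverse
    deb http://archive.ubuntu.com/ubuntu xenial-updates main restricted universe multiverse
    deb http://archive.ubuntu.com/ubuntu xenial-security main restricted universe multiverse

With universe enabled, a "sudo apt-get update" followed by "sudo apt-get install gparted testdisk" should at least pull in the usual recovery tools.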

Turns out, Ubuntu-Gnome does none of these types right, and admits the XFS type is experimental.  But it can't even format JFS correctly to a point where it can recognize it itself, and it also fails completely with the others, just not as fast, except for NTFS, which it can't do at all.  FAT32 fails shortly as well, and it should be the most easily mastered type of all outside Ext2.

I have downloaded and burned Ubuntu 8.04 through 12.04 to DVD-RW disks, skipping the ??.10 releases entirely.  I hope to get back to a point where the drivers worked.  I fault designers and maintainers for adding or deleting features without fully understanding what they had before.  An example is someone removing --configure from dpkg, and nobody documenting what the results of running dpkg -V mean.
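
For reference, these are the two dpkg calls in question as I understand them; the -V output is only sparsely documented, so treat my reading of it as a best guess from the man page:

    # Finish configuring any packages left half-installed.
    sudo dpkg --configure -a

    # Verify installed files against the md5sums recorded by their packages.
    # Output lines look like "??5??????  c /etc/example.conf": the 5 in the
    # third column means the file's checksum no longer matches what was
    # shipped, and the c marks a conffile.
    sudo dpkg -V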

I learned the developers and maintainers each do their own thing per distro, so what works one way in one distro does not hold for the next, unless they decided to begin at that point and go on from there.  I also learned that you cannot lock in to a specific release of a package, so that if I found an older version that appeared to be working right, the software updater process would override it with later changes, unless you did no upgrades at all.  I also learned that there are these issues:

(1)  dpkg assumes responsibility for package management once installed, but makes no record of what you have in place.  So it can't revert to what you had if there is a problem with any upgrade.  Nor will it let you go back to your last stable configuration.  You have to try to do all that with backups and restores of just your system files, excluding things like /proc, /tmp, /mnt, /media, */Trash or */trash, /boot/grub/grub.cfg, /dev, /home, and others.  A sketch of what I mean follows this point.
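
Lacking anything built in, the workaround amounts to something like the following rsync snapshot of the system files, with the volatile and user areas excluded; /mnt/backup is just a placeholder for wherever the backup drive is mounted:

    # Snapshot the root filesystem, skipping virtual filesystems, mount
    # points, trash folders, the grub config and /home.
    sudo rsync -aAX --delete \
      --exclude="/proc/*" --exclude="/sys/*" --exclude="/dev/*" --exclude="/tmp/*" \
      --exclude="/run/*" --exclude="/mnt/*" --exclude="/media/*" --exclude="/home/*" \
      --exclude="*/Trash" --exclude="*/trash" --exclude="/boot/grub/grub.cfg" \
      / /mnt/backup/rootfs/

Restoring would be roughly the same command with source and destination swapped, run from a live session.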

(2)  There is no way I am aware of that you can hold any package to a specific release or revert to an earlier release.  You can't cross distro lines for specific packages except by generalizations in /etc/apt/sources.list, or specific *.list files in /etc/apt/sources.list.d, or setting up a PPA for them, then making manual changes to these when you don't want them to be used in the future.  Again, going back is not an option.
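
For what it's worth, apt does offer a hold mechanism, sketched below, but it only freezes whatever version happens to be installed right now; it still gives no way back to an earlier release, which is the part that matters here:

    # Freeze a package at its installed version so upgrades skip it.
    # "somepackage" is just a placeholder name.
    sudo apt-mark hold somepackage

    # See what is currently held, and release the hold later.
    apt-mark showhold
    sudo apt-mark unhold somepackage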

(3)  There is no white-listing or black-listing of packages by users, except for some stars that appear in software-center.  Who knows where they came from, and you can't read the reviews either.  The idea that a user has to know the exact place or packages involved in a point of failure is stupid.  The failure happened, and I would be sure to warn others against buying their crummy junk.  Consider yourselves warned.

Meanwhile there is nothing to provide a guide to working with Sparky Rescue, and it is also seriously lacking in entries in /etc/apt/sources.list and under /etc/apt/sources.list.d/*.list.  Why not treat the rescue distro as a full-blown Linux distro by adding more entries here?  I believe every LiveCD should go out prepared to do data rescue and partition rebuilds on drives, and to verify every write to a drive as to its integrity and placement.  I just ran testdisk on my internal drive, and it was really trashed, as superblocks from old partitionings were also brought forward.  I had multiple overlapping partitions, and had to learn how testdisk would finally let me recover 3 that did not share overlapping boundaries.  There needs to be a tool to wipe out all superblocks that do not match the new format and structure.
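
Something close to that may already exist in util-linux, if I am reading the man page right: wipefs can list and then erase leftover filesystem signatures on a whole disk before it is repartitioned (the device name below is only a placeholder):

    # Show every filesystem, RAID or partition-table signature found on the disk.
    sudo wipefs /dev/sdb

    # Erase them all; only do this on a disk you are about to repartition,
    # since it removes the signatures of anything still on it.
    sudo wipefs -a /dev/sdb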

I want a 2-of-3 best-guess estimate of what is there when the recovery is done, or maybe a 3-of-4 or even 3-of-5 estimate of the files and contents, to use to get the best results.  Nothing is set up that way.  If there is disagreement, maybe a checksum check would help, or, if it is a text file, a view of the contents of each one can be used to determine which, if any, of these to go with.  Maybe if a possible duplicate file is indicated by name and size, the contents can be compared before deciding the fate of either.  This comparison could be in different folders or across different drives and partitions, even different machines.
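
Even without such a tool, the pairwise comparison can be done by hand along these lines; the paths are made-up examples of two recovered copies:

    # Compare two recovered candidates byte for byte before deciding which to keep.
    cmp --silent /mnt/rescue1/notes.txt /mnt/rescue2/notes.txt \
      && echo "identical" || echo "differ"

    # Or checksum whole trees and diff the resulting lists.
    ( cd /mnt/rescue1 && find . -type f -exec sha256sum {} + | sort -k2 ) > /tmp/sums1
    ( cd /mnt/rescue2 && find . -type f -exec sha256sum {} + | sort -k2 ) > /tmp/sums2
    diff /tmp/sums1 /tmp/sums2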
