Swift Image Processing Library ImageHelper Explained (UIImage and UIImageView Extensions)

One, Introduction to ImageHelper

ImageHelper (originally named AFImageHelper) is an image processing library written in Swift. It extends UIImage and UIImageView with methods for compression, colors, gradients, cropping, and other operations, and it supports fetching images from the web with caching.

Two, Configuring ImageHelper

(1) Download the latest code from GitHub: https://github.com/melvitax/ImageHelper

(2) Add ImageHelper.swift and ImageViewExtension.swift to your project.


Three, UIImageView Extension Usage Examples

1. Load an image into a UIImageView directly from a URL

(1) You can choose whether to cache images (caching is enabled by default). When enabled, each network request first checks whether a cached copy of the image exists locally; if so, the cached image is used directly. Newly downloaded images are cached for later use.


imageView1.imageFromURL("http://www.hangge.com/blog/images/logo.png", placeholder: UIImage())
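If you do not want a particular request to use the cache, you can pass shouldCacheImage: false. The following is only a sketch, assuming the same parameter order as the full call shown in point (4) below:

// Sketch: bypass the local image cache for this request
// (parameter order assumed from the full call in point (4))
imageView1.imageFromURL("http://www.hangge.com/blog/images/logo.png",
    placeholder: UIImage(), fadeIn: true, shouldCacheImage: false) { _ in }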

(2) You can set a placeholder image, which is displayed until the network image finishes loading.


imageView1.imageFromURL("http://www.hangge.com/blog/images/logo.png",
    placeholder: UIImage(named: "loading")!)

(3) You can also specify whether the image fades in after loading (the default is true).

imageView1.imageFromURL("http://www.hangge.com/blog/images/logo.png",
    placeholder: UIImage(named: "loading")!,
    fadeIn: true)

(4) You can perform follow-up work in the callback that fires after the image loads successfully.

imageView1.imageFromURL("http://www.hangge.com/blog/images/logo.png",
    placeholder: UIImage(named: "loading")!, fadeIn: true, shouldCacheImage: true) {
    (image: UIImage?) in
    if image != nil {
        print("Picture loaded successfully!")
    }
}

Four, UIImage Extension Usage Examples

1. Load an image from a URL

Like UIImageView, UIImage can also fetch a network image from a URL, with support for a placeholder image, caching, and a completion callback.

(Internally, UIImageView's URL loading calls the UIImage.image(fromURL:) method.)

let url = "http://www.hangge.com/blog/images/logo.png"
UIImage.image(fromURL: url, placeholder: UIImage(), shouldCacheImage: true) {
    (image: UIImage?) in
    if image != nil {
        self.imageView1.image = image
    }
}

2. Generate a UIImage from a color

(1) Use a solid color


UIImage(color: UIColor.orange, size: CGSize(width: 55, height: 30))

(2) Use a linear gradient


let gradientColors = [UIColor.orange, UIColor.red]
UIImage(gradientColors: gradientColors, size: CGSize(width: 55, height: 30))

(3) Use a radial gradient


UIImage(startColor: UIColor.orange, endColor: UIColor.red,
    radialGradientCenter: CGPoint(x: 0.5, y: 0.5), radius: 1,
    size: CGSize(width: 55, height: 30))

3. Overlay a gradient on an image

Here we overlay a translucent yellow-brown gradient on the original UIImage, giving the picture an old-photo filter effect.

(The default blend mode is CGBlendMode.normal; a different blendMode can also be specified.)


let gradientColors = [UIColor(red: 0.996, green: 0.769, blue: 0.494, alpha: 1.0),
                      UIColor(red: 0.969, green: 0.608, blue: 0.212, alpha: 0.2)]
imageView2.image = UIImage(named: "beach")?.apply(gradientColors: gradientColors)
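The note above implies that apply(gradientColors:) also accepts a blendMode parameter. As a hedged sketch (the parameter name is assumed from that note), a multiply blend could be requested like this:

// Sketch: overlay the same gradient using a multiply blend instead of the default
// (the blendMode parameter name is assumed from the note above)
imageView2.image = UIImage(named: "beach")?.apply(gradientColors: gradientColors,
    blendMode: .multiply)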

4. Generate a UIImage from text

In addition to the text content, you can set the font size, text color, and background color.


// Assuming the UIImageView on the interface is 180 x 80
let textSize = 46 * UIScreen.main.scale
let imageWidth = 180 * UIScreen.main.scale
let imageHeight = 80 * UIScreen.main.scale
if let image = UIImage(text: "hangge", font: UIFont.systemFont(ofSize: textSize),
                       color: UIColor.white, backgroundColor: UIColor.orange,
                       size: CGSize(width: imageWidth, height: imageHeight)) {
    imageView1.image = image
}

5. Take a screenshot of any UIView object (Screenshot)

// Convert the current page to an image
let image = UIImage(fromView: self.view)
imageView2.image = image

6. Alpha channel

// Check whether the image has an alpha channel
UIImage(named: "logo")?.hasAlpha
// Add an alpha channel to the image
UIImage(named: "logo")?.applyAlpha()

7. Add padding to an image (a transparent border)

In the following example the two image views have the same size and both use the Aspect Fit content mode; the lower one displays the image with transparent padding added.


imageView1.image = UIImage(named: "beach")
imageView2.image = UIImage(named: "beach")?.apply(padding: 50)

8. Crop an image

(1) Crop at a custom position and size


imageView1.image = UIImage(named: "beach")
let rect = CGRect(x: 0, y: 0, width: 500, height: 200)
imageView2.image = UIImage(named: "beach")?.crop(bounds: rect)

(2) Automatically crop to a square


imageView1.image = UIImage(named: "beach")
imageView2.image = UIImage(named: "beach")?.cropToSquare()

9. Resize an image


imageView1.image = UIImage(named: "beach")
imageView2.image = UIImage(named: "beach")?.resize(toSize: CGSize(width: 300, height: 400))

Since devices have different screen scales, you can compute the size dynamically by multiplying a fixed width and height by the screen scale, so the image displays correctly on every device without blurring. (The same approach also works for padding and border values.)

let width = 300 * UIScreen.main.scale
let height = 400 * UIScreen.main.scale
let image = UIImage(named: "myImage")?.resize(toSize: CGSize(width: width, height: height))

10. Generate rounded-corner or circular images

(1) Image with rounded corners


imageView2.image = UIImage(named: "beach")?.roundCorners(cornerRadius: 70)

(2) Rounded-corner image with a border


imageView1.image = UIImage(named: "beach")?.roundCorners(cornerRadius: 70, border: 200,
    color: UIColor.orange)

(3) Circular image

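The original code sample for this point was cut off. As a minimal sketch, assuming the extension provides a roundCornersToCircle() method alongside the roundCorners(cornerRadius:) calls shown above, a circular image could be produced like this:

// Sketch: crop the image to a circle
// (method name assumed by analogy with roundCorners above)
imageView2.image = UIImage(named: "beach")?.roundCornersToCircle()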
